
Comments (4)

sebkopf commented on August 25, 2024

@japhir this feature might be useful for clumpedr if you want to automate the small delta calculations before going into the specialized calculations for the clumps.


japhir commented on August 25, 2024

I've implemented this in little_deltas(), but mine is much less generic, relying on existing columns R45 and R45_wg, which are calculated with abundance_ratios() in the wrapper delta_values().
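For concreteness, the core of such a little-delta step could look roughly like the sketch below. This is not the actual little_deltas() source, just a minimal dplyr version assuming the standard definition delta = (R_sample / R_wg - 1) * 1000 and the column names mentioned above (the helper name add_little_deltas is made up here):

```r
library(dplyr)

# Minimal sketch of a little-delta calculation on raw-data columns.
# Assumes R45 (sample ratio) and R45_wg (working gas ratio) already exist,
# e.g. from abundance_ratios(); d45 is reported in per mil.
add_little_deltas <- function(df) {
  df %>%
    mutate(d45 = (R45 / R45_wg - 1) * 1000)
}
```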

I first want to do background corrections, so I have to see if I want to keep it as generic as yours.

Like I said before, I might try to work with iso_file and iso_file_list instead of just the raw data at some point, but that wrapper layer seems a bit more complicated and may be harder for users to understand. For now I've stuck to processing the results from iso_get_raw_data(), except for my clean_did_info(), which now simply relies on iso_parse_file_info() and iso_mutate_file_info().


sebkopf commented on August 25, 2024

Yes, the background correction is definitely quite clump-specific; hopefully the other functions will be useful on your end once they are all implemented as S3 generics. I'm hoping they will help clumpedr support Nu data files in addition to Thermo.

I agree with you that it's easier for the end user to work with data frames and would second the plan to keep it that way for simplicity. I see three missing steps to make the isoreader functionality useful on your end:

1. automatic retrieval of backgrounds into the $bgrd_data data frame (iso_get_bgrd_data() already works well for Nu data files but not for Thermo),
2. retrieval of backgrounds together with the raw data via iso_get_raw_data(include_bgrd = TRUE), so your background correction function can operate on the result afterwards, and
3. making iso_calculate_ratios compatible with data frames via S3.

The new calculate_deltas function is already compatible with both data frames and iso_files. Does that seem like it would be useful?
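For reference, once those pieces exist, the envisaged pipeline could be sketched roughly as below; the include_bgrd argument is only the proposal above, your_background_correction() is a placeholder for the clumpedr-side correction, and the ratio specification is purely illustrative:

```r
library(dplyr)
library(isoreader)
library(isoprocessor)

processed <- iso_files %>%
  # steps 1 + 2: raw data with the backgrounds attached (proposed argument, not yet in isoreader)
  iso_get_raw_data(include_bgrd = TRUE) %>%
  # placeholder for the clumpedr background / pressure baseline correction on the data frame
  your_background_correction() %>%
  # step 3: ratio calculation dispatching on data frames via S3 (ratio spec illustrative)
  iso_calculate_ratios(ratios = c("45/44", "46/44", "47/44")) %>%
  # per the comment above, already works on both data frames and iso_files
  calculate_deltas()
```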


japhir commented on August 25, 2024

For the background corrections I'm actually relying on the 253 Plus's specific 'half-cup' at mass 54.5. I don't do anything with the $bgrd_data. That should probably be a small correction to undo the unwanted corrections that Isodat applies, right?

The "background corrections" or "pressure baseline corrections" that I'm talking about refer to using either (1) several background scan files (.scn) at different intensities to calculate a regression between initial intensity on mass 44 and the (mostly negative) background value on masses 47 through 49 or (2) the extra half-mass cup described above to characterize this negative background associated with electron backscattering from the mass 44 cup (this is currently implemented with a simple factor). This needs to happen before we calculate any deltas. It seems like the Nu instruments aren't affected by this. I'll try to implement (1) in the next few weeks before ICP, so that I can compare PBL corrections done with background scans to those done with the half-cup (which isn't available for some of my data that was measured on a MAT 253).

Yeah, we should think about how to integrate the more basic steps into isoprocessor and put the clumped-specific steps in clumpedr. I think ultimately we should use your functions, which are applicable to all instruments, but I'm a bit reluctant because some of the machinery needed for generic code seems a bit complex for beginners to understand: using the (excellent) S3 generics means people have to look up calculate_deltas.data.frame to see the source code instead of just calculate_deltas, which would only show a thin wrapper around UseMethod("calculate_deltas"). In a similar vein, having my explicit calls to calculate the little delta values for all masses potentially makes it easier to understand for our user base, who are probably not so familiar with R that they can tell the wrapper code that makes things generic apart from the actual calculation. If I recall correctly, I once even replaced a mutate_at() call with a mutate() call with many arguments because I thought my students would find it easier to follow.
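To illustrate the point about dispatch (this is just the generic S3 pattern, not the actual isoprocessor source): looking up the generic only shows the dispatcher, while the calculation lives in the class-specific method.

```r
# The generic is just a dispatcher ...
calculate_deltas <- function(x, ...) UseMethod("calculate_deltas")

# ... and the actual work lives in the method, e.g. for data frames:
calculate_deltas.data.frame <- function(x, ...) {
  # real delta calculations would go here
  x
}
```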

I think that with proper documentation switching to nice code shouldn't be such a problem though... I hope that refactoring will now be a lot easier with my new very basic unit tests. What do you reckon?
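For example, a very basic testthat-style check along these lines (using the hypothetical add_little_deltas() helper from the sketch above; values and column names are made up):

```r
library(testthat)

test_that("little deltas are zero when sample and working gas ratios match", {
  df <- data.frame(R45 = 0.0120, R45_wg = 0.0120)
  expect_equal(add_little_deltas(df)$d45, 0)
})
```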
