
Comments (8)

whokion commented on May 27, 2024

For the long term, we may consider adding the x-section calculations for those three models too, so that both options are supported (moving in the opposite direction from Geant4!). Even though the task is not trivial, it should not take more than a couple of weeks for one person. If the celeritas application does not feed enough FLOPs, the ALUs are mostly idle, since most simulation tasks are memory-bound. Even though the on-the-fly calculation for each model is much slower than the pre-built table approach, it may still be worthwhile to mix models with different options together (as FLOPs are nearly free up to a point) and to evaluate whether there is any potential gain from mitigating the memory latency/access. At the least, we would provide a more accurate simulation option and understand/measure the computing performance for the different cases. Just IMHO.

whokion commented on May 27, 2024

Reviewing the current (10.7) Geant4 EM models in the standard EM physics list: the use of the element selector (via either SelectRandomAtom(couple, ...) or SelectTargetAtom(couple, ...), where the element selector is built during the initialization of a model) to replace the on-the-fly calculation (via SelectRandomAtom(material, ...)) in the relevant atomic-number-dependent models was introduced as follows:

  • BetheHeitler: since 10.5
  • SeltzerBerger: since 10.5
  • LivermorePhotoElectric: still uses the on-the-fly option
  • LivermoreRayleigh: has used the element selector for a long time, but was only added to the standard EM physics list in 10.4

All other EM models used in celeritas (KleinNishinaCompton, MollerBhabha, eplusAnnihilation, MSC) are atomic-number-independent models, except that MollerBhabha uses SelectRandomAtomNumber(material), which does not use the atomic x-section calculation.

whokion commented on May 27, 2024

To clarify for Geant4 in the second bullet: Geant4 uses the "macroscopic" cross section of a process per material (integrated over all elements of the material) for sampling the GPIL and selecting the process that governs the next step, but it does not use "the total cross section of a particle" (i.e., integrated over all processes for a given particle) per material, except for the recently added general gamma process. In the description, "total cross section" is used interchangeably with the macroscopic cross section per material (which is fine), but it should be clearly distinguished from the total cross section of the particle over all processes (which was somewhat confusing to me).

As a second thought on how to deal with selecting an atom in an atomic-number-dependent model: we should at least implement the on-the-fly element cross-section evaluation for the first pass (i.e., exactly following the Geant4 approach). There are several potential issues with using pre-tabulated model cross sections: 1) physics validation, as the interpolation is now two-fold, over the particle energy and the atomic number (both of which usually enter an atomic-number-dependent cross-section formula non-linearly); 2) a related argument about oversimplifying the final-state sampling in a physics model; 3) arguments about performance comparisons (i.e., the CPU version of celeritas would not be equivalent to what Geant4 does) and about performance itself (i.e., arithmetic vs. memory), even though the on-the-fly approach adds arithmetic intensity for both. Of course, the tabulated approach can be explored later as an option for models whose element cross sections are only weakly non-linear (and if the approximation is good enough for HEP, we should also recommend that Geant4 introduce the same approach as an option).
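
A minimal sketch of that on-the-fly first pass (all names and the cross-section formula here are hypothetical illustrations, not Celeritas or Geant4 code): evaluate each element's microscopic cross section at the track energy, weight it by the element's number density, and sample an element in proportion to its partial macroscopic cross section.

```cpp
// Hypothetical sketch of on-the-fly element selection; micro_xs is a
// placeholder formula, not a real physics model. Assumes C++17.
#include <cstddef>
#include <random>
#include <vector>

struct ElementRecord
{
    double atomic_number;   // Z
    double number_density;  // atoms per unit volume
};

// Placeholder for a model's analytic per-atom cross section sigma(Z, E)
double micro_xs(double z, double energy)
{
    return z * (z + 1) / energy;  // illustrative Z/energy dependence only
}

// Sample an element index with probability proportional to its partial
// macroscopic cross section n_i * sigma_i(E)
template<class Rng>
std::size_t select_element(std::vector<ElementRecord> const& elements,
                           double energy,
                           Rng& rng)
{
    std::vector<double> partial(elements.size());
    double total = 0;
    for (std::size_t i = 0; i < elements.size(); ++i)
    {
        total += elements[i].number_density
                 * micro_xs(elements[i].atomic_number, energy);
        partial[i] = total;  // running, unnormalized CDF
    }
    double u = std::uniform_real_distribution<double>{0, total}(rng);
    std::size_t i = 0;
    while (i + 1 < elements.size() && partial[i] < u)
        ++i;
    return i;
}
```

Sampling this way reproduces exactly what the analytic model gives at the track's energy, at the cost of evaluating the formula once per element per interaction.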

Note on caching the element cross section per material: this may not be efficient anyway, as the sampling also depends on the particle energy (which changes at every step for a non-elastic process, even within the same volume).

sethrj commented on May 27, 2024

Per today's discussion:

  • Shift's assumption that microscopic cross sections can always be linearly combined into a macroscopic cross section doesn't necessarily hold: for some processes/models, the microscopic cross sections are functions of material properties (e.g. electron density).
  • Precalculating and linearly interpolating per-material cross sections is probably OK because it's only used for sampling the distance-to-collision, but trying to interpolate microscopic cross sections for selecting a particular element runs an unacceptable risk of biasing the simulation.
  • Given these two data points, we will always need to calculate microscopic cross sections on the fly for the selected process.
  • The largest number of elements in any material in CMS is 15, so it wouldn't be a huge burden to preallocate that much storage for each thread for cross-section calculations, as opposed to using a stack allocator; see the sketch after this list.
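
For instance, the preallocated per-thread storage could be as simple as a fixed-capacity buffer sized from the largest element count over all materials (a sketch; the names and the hard-coded capacity are illustrative):

```cpp
// Hypothetical per-thread scratch space for element cross sections:
// capacity fixed at startup from the largest element count over all
// materials (15 for CMS), so no stack allocator is needed.
#include <array>
#include <cstddef>

constexpr std::size_t max_elements = 15;

struct ElementXsScratch
{
    std::array<double, max_elements> xs{};  // one slot per element
    std::size_t size = 0;  // number of elements in the current material
};
```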

sethrj commented on May 27, 2024

So action items are:

  • Add an accessor for the maximum number of elements (or nuclides, once we start doing hadronic physics) over all materials
  • Add temporary elemental cross-section storage to the material track view (?) or another suitable container
  • Add a functor (templated on a MicroXsEvaluator class) for calculating cross sections and sampling an element based on that element's contribution to the macroscopic xs for the given process (sketched below)
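
A rough sketch of what that functor could look like, under the assumption that MicroXsEvaluator is any callable returning the microscopic cross section for an element index at the track's current energy (all names here are hypothetical, not the eventual Celeritas API):

```cpp
// Hypothetical element-selector functor; not the eventual Celeritas API.
#include <cstddef>
#include <random>

template<class MicroXsEvaluator>
class ElementSelector
{
  public:
    // calc_xs(i) returns the microscopic cross section of element i at
    // the track's current energy; scratch points at the preallocated
    // per-thread buffer for partial sums.
    ElementSelector(MicroXsEvaluator calc_xs,
                    double const* number_density,
                    double* scratch,
                    std::size_t num_elements)
        : calc_xs_(calc_xs)
        , density_(number_density)
        , partial_(scratch)
        , size_(num_elements)
    {
    }

    // Sample an element index in proportion to its contribution to the
    // macroscopic cross section for this process
    template<class Rng>
    std::size_t operator()(Rng& rng)
    {
        double total = 0;
        for (std::size_t i = 0; i < size_; ++i)
        {
            total += density_[i] * calc_xs_(i);
            partial_[i] = total;
        }
        double u = std::uniform_real_distribution<double>{0, total}(rng);
        std::size_t i = 0;
        while (i + 1 < size_ && partial_[i] < u)
            ++i;
        return i;
    }

  private:
    MicroXsEvaluator calc_xs_;
    double const* density_;
    double* partial_;
    std::size_t size_;
};
```

The partial-sum buffer is the preallocated storage from the first two items, so the sampling loop does no per-call allocation.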

sethrj commented on May 27, 2024

Per our discussion with Mihaly yesterday on the Excalibur call (see the meeting minutes): Geant4 doesn't typically calculate the elemental partial macroscopic cross sections on the fly! It precalculates and tabulates them using the G4ElementSelector, which is good news for us. We should be able to pull the microscopic cross sections (or partial macros) in from Geant4 and sample those.
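
For contrast with the on-the-fly sketch above, a tabulated selector in the spirit of what's described here would precompute the cumulative partial macroscopic cross sections on an energy grid at initialization and only do a lookup at run time (a sketch under that assumption; this is not the actual Geant4 code):

```cpp
// Hypothetical tabulated element selector; not the actual Geant4 code.
#include <cstddef>
#include <utility>
#include <vector>

class TabulatedElementSelector
{
  public:
    // cdf[bin][i] is the cumulative fraction of element i's contribution
    // to the macroscopic cross section at energy grid point `bin`, with
    // cdf[bin].back() == 1; both tables are built once at initialization.
    TabulatedElementSelector(std::vector<double> energy_grid,
                             std::vector<std::vector<double>> cdf)
        : grid_(std::move(energy_grid)), cdf_(std::move(cdf))
    {
    }

    // u is a uniform random sample on [0, 1)
    std::size_t select(double energy, double u) const
    {
        // Locate the nearest-below energy grid point (linear search for
        // brevity; a real implementation would use a log-uniform grid)
        std::size_t bin = 0;
        while (bin + 1 < grid_.size() && grid_[bin + 1] <= energy)
            ++bin;
        // Invert the stored discrete CDF for that bin
        std::vector<double> const& c = cdf_[bin];
        std::size_t i = 0;
        while (i + 1 < c.size() && c[i] < u)
            ++i;
        return i;
    }

  private:
    std::vector<double> grid_;
    std::vector<std::vector<double>> cdf_;
};
```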

whokion commented on May 27, 2024

Just to clarify: the reason the PhotoElectric effect model does not use the pre-built table is, of course, the accuracy requirement: linear interpolation within a fixed bin may not be good enough for most applications, due to resonances in the x-section in the low-energy region.

sethrj commented on May 27, 2024

Hah, I was just going to ask that! Thanks for the details.
