
Ionospheric fluid electrodynamic model

Home Page: https://gemini3d.github.io/gemini3d

License: Apache License 2.0



GEMINI


The GEMINI model (Geospace Environment Model of Ion-Neutral Interactions) is a three-dimensional ionospheric fluid-electrodynamic model written (mostly) in object-oriented Fortran (2008 standard and newer). GEMINI is used for various scientific studies, including:

  • effects of auroras on the terrestrial ionosphere
  • natural hazard effects on the space environment
  • effects of ionospheric fluid instabilities on radio propagation

The detailed mathematical formulation of GEMINI is included in GEMINI-docs. Subroutine-level documentation of individual program units is generated inline from source-code comments and rendered as web pages. GEMINI uses generalized orthogonal curvilinear coordinates and has been tested with dipole and Cartesian coordinates.

Generally, the Git main branch has the current development version and is the best place to start, while more thoroughly tested releases happen regularly. Specific releases corresponding to published results are generally noted in the corresponding journal article.

Bug Reporting

The GEMINI development team values input from our user community, particularly in the form of error reports. These allow us to ensure that the code functions properly for a wider range of conditions, platforms, and use cases than we are able to test directly.

Please open a GitHub Issue if you experience difficulty with GEMINI. Try to provide as much detail as possible so we can try to reproduce your error.

Platforms

GEMINI is intended to be OS / CPU architecture / platform / compiler agnostic. Operating system support includes Linux, MacOS, and Windows. CPU architecture support includes Intel, AMD, ARM, IBM POWER, Cray, and more. GEMINI can run on hardware ranging from a Raspberry Pi to a laptop to a high-performance computing (HPC) cluster. Generally speaking, one can run large 2D or modest-resolution 3D simulations (fewer than 10 million grid points) on a quad-core workstation, with some patience.

For large 3D simulations (many tens to hundreds of millions of grid points), GEMINI is best run in a cluster environment or on a very "large" multi-core workstation (e.g. 16 or more cores). Runtime depends heavily on the grid spacing used, which determines the time step needed to ensure stability. For example, we have found that a 20M grid point simulation takes about 4 hours on 72 Xeon E5 cores, while 200M grid point simulations can take up to a week on 256 cores. It has generally been found that acceptable performance requires > 1 GB of memory per core; moreover, a large amount of storage (hundreds of GB to several TB) is needed to store the results of large simulations.

Quick start

Building GEMINI and running the self-tests takes about 10 minutes on a laptop. Gemini3D uses several external libraries that are built in a required one-time procedure. Once initially set up, Gemini3D works "offline", that is, without internet access.

Requirements:

  • C, C++ and Fortran compilers. See compiler help for further details.
    • GCC ≥ 9 with OpenMPI or MPICH
    • Clang with OpenMPI
    • Intel oneAPI
    • Cray with GCC or Intel oneAPI backend
  • Python and/or MATLAB, for scripting front- and back-ends
  • CMake: if your CMake is too old, download a binary or python -m pip install cmake
  • MPI: any of OpenMPI, IntelMPI, MPICH, MS-MPI. See MPI help if needed. Without MPI, Gemini3D uses only one CPU core, which runs much more slowly than with MPI.
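
A quick way to sanity-check the MPI installation before building; any MPI implementation provides mpiexec, and hostname is just a stand-in payload here:

```sh
# should print the hostname twice, once per MPI process
mpiexec -np 2 hostname
```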

Gemini3D setup

Install the Gemini3D prerequisite libraries. This is a one-time process; the result is used by any Gemini3D builds you do (and by other programs). If your Python is too old, this step will also install a local Python interpreter.

```sh
git clone https://github.com/gemini3d/external.git

cmake -Dmumps_only=yes -P external/build.cmake
# installs under ~/libgem_gnu by default
```

Set the CMAKE_PREFIX_PATH environment variable (and optionally PATH, as shown below) as follows. On Linux, add to ~/.bashrc; on MacOS, add to ~/.zshrc:

```sh
export CMAKE_PREFIX_PATH=~/libgem_gnu
```
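
If the prerequisites install executables you want available on your PATH, also add the install prefix's bin directory (assuming the default ~/libgem_gnu prefix; the exact layout is an assumption here):

```sh
export PATH=~/libgem_gnu/bin:$PATH
```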

Build the Gemini3D code

```sh
git clone https://github.com/gemini3d/gemini3d.git

cd gemini3d

cmake -B build -DCMAKE_PREFIX_PATH=~/libgem_gnu

cmake --build build --parallel
```

Non-default build options may be used; see Readme_cmake for the full list.
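
For example, options are passed as -D definitions at configure time; this sketch uses only generic CMake variables (the project-specific options are listed in Readme_cmake):

```sh
# a Debug build in a separate build directory
cmake -B build-debug -DCMAKE_BUILD_TYPE=Debug -DCMAKE_PREFIX_PATH=~/libgem_gnu
cmake --build build-debug --parallel
```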

GEMINI has self-tests that compare the output of "known" test problems to reference outputs. To verify your GEMINI build, run the self-tests:

```sh
ctest --test-dir build
```
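
If a test fails, standard CTest options help with diagnosis, for example:

```sh
# print the output of failing tests
ctest --test-dir build --output-on-failure
# afterwards, re-run only the tests that failed
ctest --test-dir build --rerun-failed
```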

Offline HPC batch CTest

Note: some HPC systems have internet access only on a login node, which cannot run MPI simulations; batch sessions, including interactive ones, may be offline. To run CTest in such an environment, first download the test data from the login node:

```sh
ctest --test-dir build --preset download
```

then, from an interactive batch session, run the tests:

```sh
ctest --test-dir build --preset offline
```

GEMINI Numerical Library Dependencies

For various numerical solutions, GEMINI relies on:

  • LAPACK
  • ScaLAPACK
  • MUMPS

For file input/output, we also use:

  • HDF5
  • h5fortran
  • zlib

Running GEMINI from a Shell Environment

For basic operations, the GEMINI main program simply needs to be run from the command line, with arguments giving the number of processes to use and the simulation directory, where the input files are found and the output files are written:

```sh
mpiexec -np <number of processes> build/gemini.bin <output directory>
```

for example:

```sh
mpiexec -np 4 build/gemini.bin ~/mysim3d/arecibo
```

GEMINI can also be run via scripting front ends, e.g. python -m gemini3d.run, which accepts options such as -np.

Advanced Command Line Options

By default, only the current simulation time and a few other messages are shown to keep logs uncluttered. gemini.bin command line options include:

-d | -debug : print verbosely -- this can produce hundreds of megabytes of text on a long simulation, for advanced debugging.

-nooutput : do not write data to disk. This is for benchmarking file-output time (by comparison with a normal run); since the simulation output is lost, this option is rarely used.

-manual_grid <# x2 images> <# x3 images> : forces the code to adopt the specified domain decomposition in x2 and x3. If not given, the code attempts to find its own x2,x3 decomposition. The number of grid points in x2 and x3 must be evenly divisible by the number of user-specified images in each respective direction.

-dryrun : only run the first time step and do not write any files. This can be useful for diagnosing issues not seen in unit tests, particularly gridding issues. It completes in a few seconds (or under a minute for larger sims), so it can be run before queuing an HPC job.
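
For example, combining these options (a hypothetical 8-process run forcing a 2x4 decomposition and stopping after the first time step):

```sh
mpiexec -np 8 build/gemini.bin ~/mysim3d/test -manual_grid 2 4 -dryrun
```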

Running GEMINI through Scripting Environments

If you prefer to issue the GEMINI run command through a scripting environment, you may do so (via Python) in the following way:

  1. make a config.nml with desired parameters for an equilibrium sim (a minimal sketch follows this list).

  2. run the equilibrium sim:

    python -m gemini3d.run /path_to/config_eq.nml /path_to/sim_eq/
  3. create a new config.nml for the actual simulation and run:

    python -m gemini3d.run /path_to/config.nml /path_to/sim_out/
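
The following config.nml sketch is illustrative only; the &base namelist and parameter names here (ymd, UTsec0, tdur, dtout, activ, tcfl, Teinf) follow the pattern of common GEMINI examples but should be treated as assumptions -- see Readme_input for the authoritative format:

```fortran
&base
ymd = 2013,2,20            ! start year, month, day
UTsec0 = 18000.0           ! start time, UT seconds
tdur = 300.0               ! simulation duration (s)
dtout = 60.0               ! output cadence (s)
activ = 108.9, 111.0, 5.0  ! solar/geomagnetic activity indices for MSIS
tcfl = 0.9                 ! target CFL number
Teinf = 1500.0             ! exospheric electron temperature (K)
/
```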

Input file format

See Readme_input for details on how to prepare input data for GEMINI. Generally speaking, there are Python and MATLAB scripts available in the mat_gemini and pygemini repositories that will save data in the appropriate format once generated.

Loading and plotting output

GEMINI uses Python for essential interfaces, plotting, and analysis; MATLAB scripts relevant to GEMINI are in the mat_gemini repo.

Only the essential scripts needed to set up a simple example and plot the results are included in the main GEMINI repository. The Gemini-scripts and Gemini-examples repositories contain scripts used for various published and ongoing analyses.

See Readme_output for a description of how to load the simulation output files and the different variable names, meanings, and units.

Computing Magnetic Field Perturbations

An auxiliary program, magcalc.f90, can be used to compute magnetic field perturbations from a completed disturbance simulation. See Readme_magcalc for a full description of how this program works.

List of other associated Readmes

  1. Readme_output - information about data included in the output files of a GEMINI simulation
  2. Readme_input - information on how input files should be prepared and formatted
  3. Readme_compilers - details regarding various compilers
  4. Readme_cmake - CMake build options
  5. Readme_docs - information about model documentation
  6. Readme_mpi - help with MPI-related issues
  7. Readme_magcalc - documentation for the magnetic field calculation program
  8. Readme_VEGA - information on how to deploy and run GEMINI on ERAU's VEGA HPC system
  9. Readme_prereqs - details on how to install prerequisites on common platforms

Known limitations and issues of GEMINI

  1. Generating equilibrium conditions can be a bit tricky with curvilinear grids. A low-resolution run can be done, but it will not necessarily interpolate properly onto a finer grid, due to issues with how the grids are constructed with ghost cells etc. A workaround is to use a slightly narrower (x2) grid in the high-resolution run (a quarter of a degree seems to work most of the time).

  2. Magnetic field calculations on an open 2D grid do not appear completely consistent with model prototype results, although they are quite close. This may have been related to sign errors in the FAC calculations; these tests should be retried at some point.

  3. Occasionally MUMPS will throw an error because it underestimated the amount of memory needed for a solve. If this happens, a workaround is to add the following line of code to the potential solver being used for your simulations. If the problem persists, try changing the number to 100.

    mumps_par%ICNTL(14)=50
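    ! ICNTL(14) is the MUMPS workspace relaxation parameter: the percentage increase over the estimated working space (50 = +50%)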
  4. There are potentially some issues with the way the stability condition is evaluated: it is computed before the perpendicular drifts are solved, so when using input data it is possible to overrun it, especially if your target CFL number is > 0.8 or so. Code added as of 2018-08-20 throttles how much dt is allowed to change between time steps, and this seems to completely fix the issue; theoretically it could still happen, but it is probably very unlikely.

  5. Occasionally one will see edge artifacts in the field-aligned currents or other parameters for solves that are non-periodic in x3. This may be related to the divergence calculations needed for the parallel current (under the EFL formulation) and for compression calculations in the multifluid module, but this needs to be investigated further... These artifacts do not appear to affect solutions in the interior of the grid domain and can probably be safely ignored if your region of interest is sufficiently far from the boundary (which is always good practice anyway).

  6. Occasionally on Windows you may get a system error code 0xc0000005 when trying to run Gemini. This typically requires rebooting the Windows computer. If this is annoying, please let us know--it happens rarely enough that we're not sure if it's a Microsoft MPI bug or something else.


gemini3d's Issues

crashes with HWM14 enabled

Occasionally, enabling HWM14 can cause the simulation to crash after a while. This can be alleviated by raising the altitude at which the winds taper on/off above the default 150 km (see the HWM interface code). A more elegant solution will eventually be needed.

Bounds Check: KHI_periodic_lowres

Similar to #77 for KHI_periodic_lowres:

output:

<snip>

/mnt/raid/ci/KHI_periodic_lowres/inputs/simsize.h5 input dimensions:    34   256   128
Target (output) grid structure dimensions:    34   256   128
Sent ICs to workers in   1.311E-01 seconds.

Initial conditions (root):
------------------------
Min/max input density:   0.00E+00   1.36E+11
Min/max input velocity:  -5.08E+02   2.66E+01
Min/max input temperature:   0.00E+00   1.49E+03
Min/max input electric potential:   8.14E+02   1.92E+04
Min/max input electric potential (full grid):   8.14E+02   3.08E+04
 Priming electric field input
init_Efieldinput: Prime electric field input files: ymd,utsec: 20130220 28500.000
 init_Efieldinput: load next file for electric field input   0.0000000000000000
 Priming precipitation input
 Computing background and priming neutral perturbation input (if used)
 Initial neutral density and temperature (from MSIS) at time:          2013           2          20   28500.000000000000       calculated in time:    0.34272900000000006
 Initial neutral winds (from HWM) at time:          2013           2          20   28500.000000000000       calculated in time:     1.8281999999999909E-002
Recomputed initial dist. fields:
     gemini    0.0000000000000000        0.0000000000000000
     gemini   -7.4999188818411971E-002  -6.4298823712095554E-002
     gemini    0.0000000000000000        0.0000000000000000
 Recomputed initial BG fields:
        0.0000000000000000        0.0000000000000000
        0.0000000000000000        0.0000000000000000
        0.0000000000000000        0.0000000000000000
 Recomputed initial drifts:
       -749.98440813173238        199.99206355270539
       -1499.9837763682397       -2.2945137763304003E-003
At line 257 of file /home/beef/code/gemini3d/build/src/numerical/calculus/calculus.f90
Fortran runtime error: Dimension 2 of array 'grad2d1_curv_alt_23' has extent 128 instead of 129

Error termination. Backtrace:
#0  0x7f15c5d2d171 in ???
#1  0x7f15c5d2dd19 in ???
#2  0x7f15c5d2e0fb in ???
#3  0x97baf0 in __calculus_MOD_grad2d1_curv_alt_23
	at /home/beef/code/gemini3d/build/src/numerical/calculus/calculus.f90:257
#4  0x89a40c in __potential_mumps_MOD_potential2d_polarization_periodic
	at /home/beef/code/gemini3d/src/numerical/potential/potential2d.f90:69
#5  0x80d7e3 in __potential_comm_MOD_potential_root_mpi_curv
	at /home/beef/code/gemini3d/src/numerical/potential/potential_root.f90:172
#6  0x7d7596 in __potential_comm_MOD_electrodynamics_curv
	at /home/beef/code/gemini3d/src/numerical/potential/potential_comm_mumps.f90:221
#7  0x419b86 in gemini_main
	at /home/beef/code/gemini3d/src/libgemini.f90:397
#8  0x405fe4 in gemini3d_main
	at /home/beef/code/gemini3d/src/gemini_main.f90:55
#9  0x406300 in main
	at /home/beef/code/gemini3d/src/gemini_main.f90:21
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpiexec detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:

  Process name: [[29571,1],0]
  Exit code:    2
--------------------------------------------------------------------------
[beefy:683503] 31 more processes have sent help message help-mpi-btl-openib-cpc-base.txt / no cpcs for port
[beefy:683503] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
ERROR STOP gemini.bin run failure

Error termination. Backtrace:
#0  0x7f4d1a272171 in ???
#1  0x7f4d1a272d19 in ???
#2  0x7f4d1a273e8e in ???
#3  0x446be1 in gemini3d_run
	at /home/beef/code/gemini3d/src/utils/gemini3d_run.f90:52
#4  0x446c34 in main
	at /home/beef/code/gemini3d/src/utils/gemini3d_run.f90:5
<end of output>
Test time =   4.56 sec
----------------------------------------------------------
Test Failed.
"run_bounds_check:KHI_periodic_lowres" end time: Oct 27 17:27 EDT
"run_bounds_check:KHI_periodic_lowres" time elapsed: 00:00:04
----------------------------------------------------------

segfault with workers on check_finite_input

Until recently we had in src/gemini.f90:

if (myid==0) check_finite_input(..)

When check_finite was applied to all workers, segfaults would instantly result on -dryrun on Linux with GCC. Not crashing: GCC 10 on Windows, Intel oneAPI on Linux (CentOS 7).
For now, putting the if(myid==0) back in. It would be nice to understand the proper long-term solution.

3D CI tests not running; file not found

 ctest -V -R gemini:hdf5:3d_fang:dryrun

Fails with the error:

9: Test command: /usr/local/Frameworks/Python.framework/Versions/3.7/bin/python3.7 "/Users/zettergm/Projects/GEMINI/scripts/run_test.py" "3d_fang" "-mpiexec" "/usr/local/bin/mpiexec" "/Users/zettergm/Projects/GEMINI/build/gemini.bin" "/Users/zettergm/Projects/GEMINI/build/test3d_fang" "-dryrun"
9: Test timeout computed to be: 60
9: Traceback (most recent call last):
9:   File "/Users/zettergm/Projects/GEMINI/scripts/run_test.py", line 134, in <module>
9:     dryrun=P.dryrun,
9:   File "/Users/zettergm/Projects/GEMINI/scripts/run_test.py", line 86, in run_test
9:     shutil.copy2(z["dir"] / cfg["indat_size"], input_dir)
9:   File "/usr/local/Cellar/python/3.7.7/Frameworks/Python.framework/Versions/3.7/lib/python3.7/shutil.py", line 266, in copy2
9:     copyfile(src, dst, follow_symlinks=follow_symlinks)
9:   File "/usr/local/Cellar/python/3.7.7/Frameworks/Python.framework/Versions/3.7/lib/python3.7/shutil.py", line 120, in copyfile
9:     with open(src, 'rb') as fsrc:
9: FileNotFoundError: [Errno 2] No such file or directory: '/Users/zettergm/Projects/GEMINI/tests/data/test3d_fang/tests/data/test3d_fang/inputs/simsize.h5'
4/4 Test  #9: gemini:hdf5:3d_fang:dryrun .......***Failed    0.35 sec

build system no longer works with -Dhdf5=off

cmake -B build -Dhdf5=off
cmake --build build -j

/ihome/home2/zettergm/zettergmdata/Projects/gemini3d/src/io/milestone.f90:47:28:

flagmilestone=h5f%exists('/nsall')
1
Error: 'exists' at (1) is not a member of the 'hdf5_file' structure

Bug in neutral input/interpolation code

Somehow the program is interpreting the neutral data as smaller than it should be... This problem does not manifest in commit a60e56a, so it was introduced sometime in the last eight months, probably during one of the neutral refactors...

MPI auto-partitioning

There are 3 separate issues with auto-partitioning that currently hang, waste CPU time, or give errors:

  • need to detect when the total MPI partitions don't match the total MPI workers, and error stop in Fortran. Otherwise the simulation can deadlock, or at least waste a lot of CPU.
  • need to detect when auto-partitioning has degenerated to a single MPI image, and error stop when mpiexec has been used. NOTE: we must still preserve the ability to run a single gemini.bin without mpiexec/mpirun for new users; the gemini3d.run Fortran executable handles this by not invoking mpiexec unless more than one MPI image is used.
  • need to improve the auto-partition function, which must work identically in Fortran, MATLAB and Python: based on the total MPI images, assign an efficient partition of the program, with 1 or an even number of images in x2, x3 (a sketch follows this list). Currently the Fortran code is optimized but MATLAB and Python are not, hence I use the gemini3d.run Fortran executable at this time. I've opened issues in MatGemini and PyGemini for this; perhaps they should just use the Fortran gemini3d.run.
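
A sketch of what such a shared partition rule could look like, in Python (hypothetical, not the shipped implementation; the function name auto_partition is invented here, while lid2/lid3 follow the terminology printed in the run logs):

```python
# Hypothetical sketch: choose lid2 x lid3 = n_images with lid2 dividing lx2
# and lid3 dividing lx3, each direction using 1 or an even number of images,
# preferring near-square decompositions.
def auto_partition(n_images: int, lx2: int, lx3: int) -> tuple[int, int]:
    best = None
    for lid2 in range(1, n_images + 1):
        if n_images % lid2:
            continue  # lid2 must be a factor of the total image count
        lid3 = n_images // lid2
        if lx2 % lid2 or lx3 % lid3:
            continue  # grid points must divide evenly in each direction
        if (lid2 != 1 and lid2 % 2) or (lid3 != 1 and lid3 % 2):
            continue  # constraint above: 1 or an even number of images per direction
        score = abs(lid2 - lid3)  # smaller score = closer to square
        if best is None or score < best[0]:
            best = (score, lid2, lid3)
    if best is None:
        raise ValueError(f"no valid x2,x3 partition for {n_images} MPI images")
    return best[1], best[2]

# e.g. auto_partition(32, 184, 48) -> (4, 8), matching the "lid2, lid3: 4 8"
# decomposition seen in the GDI_periodic_lowres log below
```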

Input formats from MAGIC need to be documented in READMEs

There needs to be a clear statement in the documentation of the parameter ordering in the input files (in the case of raw input) and of how these arrays are permuted with respect to altitude, longitude, and latitude. The size variables in simsize also need to be defined, i.e. their ordering needs to be clearly specified.

Sudden halt (MPI error?) of simulation before completion

This seems to happen when running the tohoku20113D_medres example; I've not noticed it before. Basically, the simulation terminates prematurely, usually after one of the workers has gotten a signal 9, sometimes concurrent with output.

PDEelliptic bounds warning

The new GCC 9.3.0 gives new warnings for a submodule of PDEelliptic; all are in elliptic2d.f90.

It appears to be a valid array bounds warning -- for example, v2 and v3 appear to be passed in from potential_root.f90, declared as:

real(wp), dimension(1:size(Phiall,2),1:size(Phiall,3)) :: v2slaball,v3slaball   !stores drift velocs. for pol. current

It's likely these issues have existed for a long time and we just didn't know about them until the new compiler gave stricter warnings. It's likely related to the Intel segfault in #7.

> cmake --build . --clean-first
[1/1] Cleaning all built files...
Cleaning... 621 files.
[249/303] Building Fortran object src/numerical/potential/CMakeFiles/PDEelliptic.dir/elliptic2d.f90.obj
../src/numerical/potential/elliptic2d.f90:660:20:

  590 |   do ix2=1,lx2
      |              2
......
  660 |       coeff=-1d0*Cm(ix2-1,ix3)*v3(ix2-1,ix3)/ &
      |                    1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:660:34:

  590 |   do ix2=1,lx2
      |              2
......
  660 |       coeff=-1d0*Cm(ix2-1,ix3)*v3(ix2-1,ix3)/ &
      |                                  1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:699:9:

  590 |   do ix2=1,lx2
      |              2
......
  699 |       Cm(ix2-1,ix3)*v3(ix2-1,ix3)/ &
      |         1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:699:23:

  590 |   do ix2=1,lx2
      |              2
......
  699 |       Cm(ix2-1,ix3)*v3(ix2-1,ix3)/ &
      |                       1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:725:20:

  590 |   do ix2=1,lx2
      |              2
......
  725 |       coeff=-1d0*Cm(ix2-1,ix3)*v2(ix2-1,ix3)/( (dx2all(ix2)+dx2all(ix2+1))*(dx2all(ix2-1)*dx2iall(ix2-1)) )
      |                    1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:725:34:

  590 |   do ix2=1,lx2
      |              2
......
  725 |       coeff=-1d0*Cm(ix2-1,ix3)*v2(ix2-1,ix3)/( (dx2all(ix2)+dx2all(ix2+1))*(dx2all(ix2-1)*dx2iall(ix2-1)) )
      |                                  1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:725:98:

  590 |   do ix2=1,lx2
      |              2
......
  725 |       coeff=-1d0*Cm(ix2-1,ix3)*v2(ix2-1,ix3)/( (dx2all(ix2)+dx2all(ix2+1))*(dx2all(ix2-1)*dx2iall(ix2-1)) )
      |                                                                                                  1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:746:33:

  590 |   do ix2=1,lx2
      |              2
......
  746 |       b(iPhi)=b(iPhi)+coeff*Phi0(ix2-1,ix3)    !BC's and pol. time deriv.
      |                                 1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:748:15:

  590 |   do ix2=1,lx2
      |              2
......
  748 |       coeff=Cm(ix2-1,ix3)*v2(ix2-1,ix3)/( (dx2all(ix2)+dx2all(ix2+1))*(dx2all(ix2)*dx2iall(ix2-1)) )+ &
      |               1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:748:29:

  590 |   do ix2=1,lx2
      |              2
......
  748 |       coeff=Cm(ix2-1,ix3)*v2(ix2-1,ix3)/( (dx2all(ix2)+dx2all(ix2+1))*(dx2all(ix2)*dx2iall(ix2-1)) )+ &
      |                             1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:748:91:

  590 |   do ix2=1,lx2
      |              2
......
  748 |       coeff=Cm(ix2-1,ix3)*v2(ix2-1,ix3)/( (dx2all(ix2)+dx2all(ix2+1))*(dx2all(ix2)*dx2iall(ix2-1)) )+ &
      |                                                                                           1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:749:9:

  590 |   do ix2=1,lx2
      |              2
......
  749 |       Cm(ix2-1,ix3)*v2(ix2-1,ix3)/( (dx2all(ix2)+dx2all(ix2+1))*(dx2all(ix2-1)*dx2iall(ix2-1)) )
      |         1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:749:23:

  590 |   do ix2=1,lx2
      |              2
......
  749 |       Cm(ix2-1,ix3)*v2(ix2-1,ix3)/( (dx2all(ix2)+dx2all(ix2+1))*(dx2all(ix2-1)*dx2iall(ix2-1)) )
      |                       1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:749:87:

  590 |   do ix2=1,lx2
      |              2
......
  749 |       Cm(ix2-1,ix3)*v2(ix2-1,ix3)/( (dx2all(ix2)+dx2all(ix2+1))*(dx2all(ix2-1)*dx2iall(ix2-1)) )
      |                                                                                       1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:778:16:

  590 |   do ix2=1,lx2
      |              2
......
  778 |       (-1d0)*Cm(ix2-1,ix3)*v2(ix2-1,ix3)/( (dx2all(ix2)+dx2all(ix2+1))*(dx2all(ix2)*dx2iall(ix2-1)) )
      |                1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:778:30:

  590 |   do ix2=1,lx2
      |              2
......
  778 |       (-1d0)*Cm(ix2-1,ix3)*v2(ix2-1,ix3)/( (dx2all(ix2)+dx2all(ix2+1))*(dx2all(ix2)*dx2iall(ix2-1)) )
      |                              1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:778:92:

  590 |   do ix2=1,lx2
      |              2
......
  778 |       (-1d0)*Cm(ix2-1,ix3)*v2(ix2-1,ix3)/( (dx2all(ix2)+dx2all(ix2+1))*(dx2all(ix2)*dx2iall(ix2-1)) )
      |                                                                                            1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:828:15:

  590 |   do ix2=1,lx2
      |              2
......
  828 |       coeff=Cm(ix2-1,ix3)*v3(ix2-1,ix3)/ &
      |               1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:828:29:

  590 |   do ix2=1,lx2
      |              2
......
  828 |       coeff=Cm(ix2-1,ix3)*v3(ix2-1,ix3)/ &
      |                             1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:867:16:

  590 |   do ix2=1,lx2
      |              2
......
  867 |       (-1d0)*Cm(ix2-1,ix3)*v3(ix2-1,ix3)/ &
      |                1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:867:30:

  590 |   do ix2=1,lx2
      |              2
......
  867 |       (-1d0)*Cm(ix2-1,ix3)*v3(ix2-1,ix3)/ &
      |                              1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:119:24:

   84 | loopx3: do ix3=1,lx3
      |                    2
......
  119 |       coeff=-1d0*Cm(ix2,ix3-1)*v2(ix2,ix3-1)/ &
      |                        1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:119:38:

   84 | loopx3: do ix3=1,lx3
      |                    2
......
  119 |       coeff=-1d0*Cm(ix2,ix3-1)*v2(ix2,ix3-1)/ &
      |                                      1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:122:37:

   85 |   loopx2: do ix2=1,lx2
      |                      2
......
  122 |         b(iPhi)=b(iPhi)-coeff*Vminx3(ix2-1)
      |                                     1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:134:24:

   84 | loopx3: do ix3=1,lx3
      |                    2
......
  134 |       coeff=-1d0*Cm(ix2,ix3-1)*v3(ix2,ix3-1)/( (dx3all(ix3)+dx3all(ix3+1))*(dx3all(ix3-1)*dx3iall(ix3-1)) )
      |                        1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:134:38:

   84 | loopx3: do ix3=1,lx3
      |                    2
......
  134 |       coeff=-1d0*Cm(ix2,ix3-1)*v3(ix2,ix3-1)/( (dx3all(ix3)+dx3all(ix3+1))*(dx3all(ix3-1)*dx3iall(ix3-1)) )
      |                                      1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:134:98:

   84 | loopx3: do ix3=1,lx3
      |                    2
......
  134 |       coeff=-1d0*Cm(ix2,ix3-1)*v3(ix2,ix3-1)/( (dx3all(ix3)+dx3all(ix3+1))*(dx3all(ix3-1)*dx3iall(ix3-1)) )
      |                                                                                                  1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:149:19:

   84 | loopx3: do ix3=1,lx3
      |                    2
......
  149 |       coeff=Cm(ix2,ix3-1)*v2(ix2,ix3-1)/ &
      |                   1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:149:33:

   84 | loopx3: do ix3=1,lx3
      |                    2
......
  149 |       coeff=Cm(ix2,ix3-1)*v2(ix2,ix3-1)/ &
      |                                 1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:164:20:

   85 |   loopx2: do ix2=1,lx2
      |                      2
......
  164 |       coeff=-1d0*Cm(ix2-1,ix3)*v3(ix2-1,ix3)/ &
      |                    1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:164:34:

   85 |   loopx2: do ix2=1,lx2
      |                      2
......
  164 |       coeff=-1d0*Cm(ix2-1,ix3)*v3(ix2-1,ix3)/ &
      |                                  1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:167:37:

   84 | loopx3: do ix3=1,lx3
      |                    2
......
  167 |         b(iPhi)=b(iPhi)-coeff*Vminx2(ix3-1)
      |                                     1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:186:37:

   84 | loopx3: do ix3=1,lx3
      |                    2
......
  186 |       b(iPhi)=b(iPhi)+coeff*Phi0(ix2,ix3-1)
      |                                     1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:189:19:

   84 | loopx3: do ix3=1,lx3
      |                    2
......
  189 |       coeff=Cm(ix2,ix3-1)*v3(ix2,ix3-1)/( (dx3all(ix3)+dx3all(ix3+1))*(dx3all(ix3)*dx3iall(ix3-1)) )+ &
      |                   1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:189:33:

   84 | loopx3: do ix3=1,lx3
      |                    2
......
  189 |       coeff=Cm(ix2,ix3-1)*v3(ix2,ix3-1)/( (dx3all(ix3)+dx3all(ix3+1))*(dx3all(ix3)*dx3iall(ix3-1)) )+ &
      |                                 1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:189:91:

   84 | loopx3: do ix3=1,lx3
      |                    2
......
  189 |       coeff=Cm(ix2,ix3-1)*v3(ix2,ix3-1)/( (dx3all(ix3)+dx3all(ix3+1))*(dx3all(ix3)*dx3iall(ix3-1)) )+ &
      |                                                                                           1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:190:13:

   84 | loopx3: do ix3=1,lx3
      |                    2
......
  190 |       Cm(ix2,ix3-1)*v3(ix2,ix3-1)/( (dx3all(ix3)+dx3all(ix3+1))*(dx3all(ix3-1)*dx3iall(ix3-1)) )
      |             1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:190:27:

   84 | loopx3: do ix3=1,lx3
      |                    2
......
  190 |       Cm(ix2,ix3-1)*v3(ix2,ix3-1)/( (dx3all(ix3)+dx3all(ix3+1))*(dx3all(ix3-1)*dx3iall(ix3-1)) )
      |                           1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:190:87:

   84 | loopx3: do ix3=1,lx3
      |                    2
......
  190 |       Cm(ix2,ix3-1)*v3(ix2,ix3-1)/( (dx3all(ix3)+dx3all(ix3+1))*(dx3all(ix3-1)*dx3iall(ix3-1)) )
      |                                                                                       1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:195:9:

   85 |   loopx2: do ix2=1,lx2
      |                      2
......
  195 |       Cm(ix2-1,ix3)*v3(ix2-1,ix3)/ &
      |         1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:195:23:

   85 |   loopx2: do ix2=1,lx2
      |                      2
......
  195 |       Cm(ix2-1,ix3)*v3(ix2-1,ix3)/ &
      |                       1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:206:37:

   84 | loopx3: do ix3=1,lx3
      |                    2
......
  206 |         b(iPhi)=b(iPhi)-coeff*Vmaxx2(ix3-1)
      |                                     1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:218:20:

   85 |   loopx2: do ix2=1,lx2
      |                      2
......
  218 |       coeff=-1d0*Cm(ix2-1,ix3)*v2(ix2-1,ix3)/( (dx2all(ix2)+dx2all(ix2+1))*(dx2all(ix2-1)*dx2iall(ix2-1)) )
      |                    1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:218:34:

   85 |   loopx2: do ix2=1,lx2
      |                      2
......
  218 |       coeff=-1d0*Cm(ix2-1,ix3)*v2(ix2-1,ix3)/( (dx2all(ix2)+dx2all(ix2+1))*(dx2all(ix2-1)*dx2iall(ix2-1)) )
      |                                  1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:218:98:

   85 |   loopx2: do ix2=1,lx2
      |                      2
......
  218 |       coeff=-1d0*Cm(ix2-1,ix3)*v2(ix2-1,ix3)/( (dx2all(ix2)+dx2all(ix2+1))*(dx2all(ix2-1)*dx2iall(ix2-1)) )
      |                                                                                                  1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:240:33:

   85 |   loopx2: do ix2=1,lx2
      |                      2
......
  240 |       b(iPhi)=b(iPhi)+coeff*Phi0(ix2-1,ix3)    !BC's and pol. time deriv.
      |                                 1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:242:15:

   85 |   loopx2: do ix2=1,lx2
      |                      2
......
  242 |       coeff=Cm(ix2-1,ix3)*v2(ix2-1,ix3)/( (dx2all(ix2)+dx2all(ix2+1))*(dx2all(ix2)*dx2iall(ix2-1)) )+ &
      |               1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:242:29:

   85 |   loopx2: do ix2=1,lx2
      |                      2
......
  242 |       coeff=Cm(ix2-1,ix3)*v2(ix2-1,ix3)/( (dx2all(ix2)+dx2all(ix2+1))*(dx2all(ix2)*dx2iall(ix2-1)) )+ &
      |                             1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:242:91:

   85 |   loopx2: do ix2=1,lx2
      |                      2
......
  242 |       coeff=Cm(ix2-1,ix3)*v2(ix2-1,ix3)/( (dx2all(ix2)+dx2all(ix2+1))*(dx2all(ix2)*dx2iall(ix2-1)) )+ &
      |                                                                                           1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:243:9:

   85 |   loopx2: do ix2=1,lx2
      |                      2
......
  243 |       Cm(ix2-1,ix3)*v2(ix2-1,ix3)/( (dx2all(ix2)+dx2all(ix2+1))*(dx2all(ix2-1)*dx2iall(ix2-1)) )
      |         1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:243:23:

   85 |   loopx2: do ix2=1,lx2
      |                      2
......
  243 |       Cm(ix2-1,ix3)*v2(ix2-1,ix3)/( (dx2all(ix2)+dx2all(ix2+1))*(dx2all(ix2-1)*dx2iall(ix2-1)) )
      |                       1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:243:87:

   85 |   loopx2: do ix2=1,lx2
      |                      2
......
  243 |       Cm(ix2-1,ix3)*v2(ix2-1,ix3)/( (dx2all(ix2)+dx2all(ix2+1))*(dx2all(ix2-1)*dx2iall(ix2-1)) )
      |                                                                                       1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:248:13:

   84 | loopx3: do ix3=1,lx3
      |                    2
......
  248 |       Cm(ix2,ix3-1)*v2(ix2,ix3-1)/ &
      |             1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:248:27:

   84 | loopx3: do ix3=1,lx3
      |                    2
......
  248 |       Cm(ix2,ix3-1)*v2(ix2,ix3-1)/ &
      |                           1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:272:16:

   85 |   loopx2: do ix2=1,lx2
      |                      2
......
  272 |       (-1d0)*Cm(ix2-1,ix3)*v2(ix2-1,ix3)/( (dx2all(ix2)+dx2all(ix2+1))*(dx2all(ix2)*dx2iall(ix2-1)) )
      |                1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:272:30:

   85 |   loopx2: do ix2=1,lx2
      |                      2
......
  272 |       (-1d0)*Cm(ix2-1,ix3)*v2(ix2-1,ix3)/( (dx2all(ix2)+dx2all(ix2+1))*(dx2all(ix2)*dx2iall(ix2-1)) )
      |                              1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:272:92:

   85 |   loopx2: do ix2=1,lx2
      |                      2
......
  272 |       (-1d0)*Cm(ix2-1,ix3)*v2(ix2-1,ix3)/( (dx2all(ix2)+dx2all(ix2+1))*(dx2all(ix2)*dx2iall(ix2-1)) )
      |                                                                                            1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:276:20:

   84 | loopx3: do ix3=1,lx3
      |                    2
......
  276 |       (-1d0)*Cm(ix2,ix3-1)*v3(ix2,ix3-1)/( (dx3all(ix3)+dx3all(ix3+1))*(dx3all(ix3)*dx3iall(ix3-1)) )
      |                    1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:276:34:

   84 | loopx3: do ix3=1,lx3
      |                    2
......
  276 |       (-1d0)*Cm(ix2,ix3-1)*v3(ix2,ix3-1)/( (dx3all(ix3)+dx3all(ix3+1))*(dx3all(ix3)*dx3iall(ix3-1)) )
      |                                  1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:276:92:

   84 | loopx3: do ix3=1,lx3
      |                    2
......
  276 |       (-1d0)*Cm(ix2,ix3-1)*v3(ix2,ix3-1)/( (dx3all(ix3)+dx3all(ix3+1))*(dx3all(ix3)*dx3iall(ix3-1)) )
      |                                                                                            1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:298:20:

   84 | loopx3: do ix3=1,lx3
      |                    2
......
  298 |       (-1d0)*Cm(ix2,ix3-1)*v2(ix2,ix3-1)/ &
      |                    1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:298:34:

   84 | loopx3: do ix3=1,lx3
      |                    2
......
  298 |       (-1d0)*Cm(ix2,ix3-1)*v2(ix2,ix3-1)/ &
      |                                  1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:320:15:

   85 |   loopx2: do ix2=1,lx2
      |                      2
......
  320 |       coeff=Cm(ix2-1,ix3)*v3(ix2-1,ix3)/ &
      |               1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:320:29:

   85 |   loopx2: do ix2=1,lx2
      |                      2
......
  320 |       coeff=Cm(ix2-1,ix3)*v3(ix2-1,ix3)/ &
      |                             1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:350:16:

   85 |   loopx2: do ix2=1,lx2
      |                      2
......
  350 |       (-1d0)*Cm(ix2-1,ix3)*v3(ix2-1,ix3)/ &
      |                1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:350:30:

   85 |   loopx2: do ix2=1,lx2
      |                      2
......
  350 |       (-1d0)*Cm(ix2-1,ix3)*v3(ix2-1,ix3)/ &
      |                              1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
../src/numerical/potential/elliptic2d.f90:376:37:

   85 |   loopx2: do ix2=1,lx2
      |                      2
......
  376 |         b(iPhi)=b(iPhi)-coeff*Vmaxx3(ix2-1)
      |                                     1
Warning: Array reference at (1) out of bounds (0 < 1) in loop beginning at (2) [-Wdo-subscript]
[303/303] Linking Fortran executable gemini.bin.exe

Non finite results for first time step following priming

When running the code, there will sometimes be a non-finite result during the first time step, causing the simulation to stop. Most often it will run correctly if you just reissue the same command (i.e. just run mpirun again with the same input etc.). 1e9ad70 (before restarting and priming were added) seems to work fine as long as the input code is adjusted for the new HDF5 variable names (nsall, etc.)

Windows MinGW 9.3.0 only: possible issue of duplicate file writes, intermittent crashes

Need to confirm details. Observing on 2d_fang and other tests that the output files are possibly being written repetitively. This is obviously not desired, and it intermittently crashes images.
This is seen in duplicated console text printed for each output time step.

1: Current time 2013-02-20 18009.897500; dt=0.960020
1: Current time 2013-02-20 18009.897500; dt=0.960020
1: Current time 2013-02-20 18009.897500; dt=0.960020
1: Current time 2013-02-20 18009.897500; dt=0.960020
1: Current time 2013-02-20 18009.897500; dt=0.960020
1: Current time 2013-02-20 18020.097397; dt=1.070465
1: Current time 2013-02-20 18020.097397; dt=1.070465
1: Current time 2013-02-20 18020.097397; dt=1.070465
1: Current time 2013-02-20 18020.097397; dt=1.070465
1: Current time 2013-02-20 18020.097397; dt=1.070465
1: Current time 2013-02-20 18031.303462; dt=1.155498
1: Current time 2013-02-20 18031.303462; dt=1.155498

Also may have console print for MPI

gemini\build\gemini.bin.exe Process:       0 /     -1 
gemini\build\gemini.bin.exe Process:       0 /     -1 
gemini\build\gemini.bin.exe Process:       0 /     -1 

This is NOT happening for Linux with current master 3ebbb02 but does appear to happen back to v0.5.1 and possibly earlier on Windows MinGW only.

Grid deallocation may not happen

Here:

!    deallocate(x%e1,x%e2,x%e3)    !handled by clear_unitvecs (assuming netural perturbations are used)
!    deallocate(x%er,x%etheta,x%ephi)


the explicit deallocation of grid members may not happen at the end of the simulation if neutral perturbations are not used.
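
A minimal defensive fix sketch, assuming these members are ALLOCATABLE (member names taken from the commented-out code above): guard each deallocate so cleanup is safe whether or not clear_unitvecs already freed them:

```fortran
! sketch: only deallocate grid members that are still allocated
if (allocated(x%e1)) deallocate(x%e1, x%e2, x%e3)
if (allocated(x%er)) deallocate(x%er, x%etheta, x%ephi)
```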

latest main branch build fails to find HDF5

This is on an Intel Mac:

-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Success
-- Found Threads: TRUE
-- Performing Test HDF5_C_links
-- Performing Test HDF5_C_links - Success
-- Performing Test HDF5_Fortran_links
-- Performing Test HDF5_Fortran_links - Failed
CMake Error at /usr/local/Cellar/cmake/3.21.2/share/cmake/Modules/FindPackageHandleStandardArgs.cmake:230 (message):
Could NOT find HDF5 (missing: HDF5_links) (found version "1.10.4")
Call Stack (most recent call first):
/usr/local/Cellar/cmake/3.21.2/share/cmake/Modules/FindPackageHandleStandardArgs.cmake:594 (_FPHSA_FAILURE_MESSAGE)
cmake/Modules/FindHDF5.cmake:638 (find_package_handle_standard_args)
CMakeLists.txt:28 (find_package)

Ubuntu ARM64 segfaults while trying to download test data

test 1
Start 1: mini2dns_fang:download

1: Test command: /home/parallels/.local/bin/cmake "-Dname=mini2dns_fang" "-Doutdir:PATH=/home/parallels/Projects/gemini3d/build/mini2dns_fang" "-Drefroot:PATH=/home/parallels/Projects/gemini3d/test_data" "-P" "/home/parallels/Projects/gemini3d/cmake/test/download.cmake"
1: Test timeout computed to be: 180
1: -- TLS status: NOTFOUND
1: CMake Warning at /home/parallels/Projects/gemini3d/cmake/CheckTLS.cmake:21 (message):
1: TLS seems to be broken on your system. Download will probably fail.
1: NOTFOUND
1: Call Stack (most recent call first):
1: /home/parallels/Projects/gemini3d/cmake/CheckTLS.cmake:28 (check_tls)
1: /home/parallels/Projects/gemini3d/cmake/test/download.cmake:3 (include)
1:
1:
1: -- TLS status: NOTFOUND
1: CMake Warning at /home/parallels/Projects/gemini3d/cmake/CheckTLS.cmake:21 (message):
1: TLS seems to be broken on your system. Download will probably fail.
1: NOTFOUND
1: Call Stack (most recent call first):
1: /home/parallels/Projects/gemini3d/cmake/test/download.cmake:4 (check_tls)
1:
1:
1: -- mini2dns_fang: missing hash file, seeing if we need to download and/or extract
1: -- DOWNLOAD: https://www.dropbox.com/s/lk3ib91eimjub9f/mini2dns_fang.zst?dl=1 => /home/parallels/Projects/gemini3d/test_data/mini2dns_fang.zst sha256: 6f7682d62a6456634a4e07b8c2ebfd8a54cbb2bb4f306ab2e3f5418a6e51ffda
1/1 Test #1: mini2dns_fang:download ...........***Exception: SegFault 0.40 sec

Restart code requires input and output file formats to be h5

Even if the milestones are written as .h5, the find_milestone() subroutine uses the input file format to determine whether restarting is possible. So if you happen to use binary input and h5 output (restarting should be possible in principle), it will refuse to restart due to the binary file inputs... Generally, we should probably all just migrate to h5 for everything so we don't have to keep fighting with formats, etc.

Readme_output details for Python

I started adding more info about how to load simulation results, but I was only able to do this for MATLAB; someone needs to add directions for Python. A related issue is that the MATLAB scripts referred to are actually in a different repository (gemini-matlab). I'm not sure how to square this with the fact that the Readme for those scripts needs to be at least linked from the main repo, and should have MATLAB and Python instructions appearing side by side.

GCC-7 expand_envvar

An Ubuntu 18.04 user with GCC 7 reported a unit test failure that is reproducible on CI with the same setup (GCC 7.5.0).
While Gemini3D supports factory-maintained compilers (currently GCC >= 9), GCC 7 still works with all other parts of Gemini, so it may be worthwhile to find out whether a quick workaround is possible, or whether this feature is simply not available for GCC 7.

Is the problem using get_environment_variable()?

17/30 Test #47: unit:expand_envvar .....................***Failed    0.00 sec
ERROR STOP expand_envar: expected abchelloxyz, got abchello   

diagnosis notes

Recommend SSHing into a system that already has GCC 7, as the bug could be in one or two things and is too cumbersome to diagnose via CI.
Underlying code is in gemini3d/src/io/{config,test_expand_envvar}.f90

Extraneous diagnostic printing

! Accessing root-only grid information in divergence function grad2D1_curv_alt_23

For some reason this is not suppressed, but it should be unless the debug flag is given.

Check if kchem influences GLOW results

In glow_run.F90, change kchem=4 to kchem=1 (or 2 or 3) and compare model run speeds and results, to see whether performance improves without altering the output.

Bounds Check: GDI_periodic_lowres

GDI_periodic_lowres fails on bound check with -dryrun, with output below:

6/148 Testing: run_bounds_check:GDI_periodic_lowres
6/148 Test: run_bounds_check:GDI_periodic_lowres
Command: "/home/beef/code/gemini3d/build/Debug/gemini3d.run.debug" "/mnt/raid/ci/GDI_periodic_lowres" "-mpiexec" "/usr/lib64/openmpi/bin/mpiexec" "-dryrun"
Directory: /home/beef/code/gemini3d/build/Release
"run_bounds_check:GDI_periodic_lowres" start time: Oct 27 17:21 EDT
Output:
----------------------------------------------------------
gemini3d.run: detected CPU count: 32
MPI partition of lx2, lx3: 184 48 is lid2, lid3: 4 8
MPI images: 32

<snip>

/mnt/raid/ci/GDI_periodic_lowres/inputs/simsize.h5 input dimensions:    34   184    48
Target (output) grid structure dimensions:    34   184    48
Sent ICs to workers in   3.172E-02 seconds.

Initial conditions (root):
------------------------
Min/max input density:   0.00E+00   1.46E+11
Min/max input velocity:  -5.52E+02   2.75E+01
Min/max input temperature:   0.00E+00   1.49E+03
Min/max input electric potential:   0.00E+00   0.00E+00
Min/max input electric potential (full grid):   0.00E+00   0.00E+00
 Priming electric field input
init_Efieldinput: Prime electric field input files: ymd,utsec: 20130220 18000.000
 init_Efieldinput: load next file for electric field input   0.0000000000000000
           0  using Lagrangian grid moving at:     500.00000745058060        0.0000000000000000
 Priming precipitation input
 Computing background and priming neutral perturbation input (if used)
 Initial neutral density and temperature (from MSIS) at time:          2013           2          20   18000.000000000000       calculated in time:     5.5562000000000000E-002
 Initial neutral winds (from HWM) at time:          2013           2          20   18000.000000000000       calculated in time:     2.8360000000000052E-003
Recomputed initial dist. fields:
     gemini    0.0000000000000000        0.0000000000000000
     gemini   -0.0000000000000000       -0.0000000000000000
     gemini    0.0000000000000000        0.0000000000000000
 Recomputed initial BG fields:
        0.0000000000000000        0.0000000000000000
        0.0000000000000000        0.0000000000000000
       -2.5000000372529030E-002  -2.5000000372529030E-002
 Recomputed initial drifts:
       -499.99915044276207       0.12063574780673696
       -249.69821805881347        68.445269778559336
At line 257 of file /home/beef/code/gemini3d/build/src/numerical/calculus/calculus.f90
Fortran runtime error: Dimension 2 of array 'grad2d1_curv_alt_23' has extent 48 instead of 49

Error termination. Backtrace:
#0  0x7fe0586c2171 in ???
#1  0x7fe0586c2d19 in ???
#2  0x7fe0586c30fb in ???
#3  0x97baf0 in __calculus_MOD_grad2d1_curv_alt_23
	at /home/beef/code/gemini3d/build/src/numerical/calculus/calculus.f90:257
#4  0x89a40c in __potential_mumps_MOD_potential2d_polarization_periodic
	at /home/beef/code/gemini3d/src/numerical/potential/potential2d.f90:69
#5  0x80d7e3 in __potential_comm_MOD_potential_root_mpi_curv
	at /home/beef/code/gemini3d/src/numerical/potential/potential_root.f90:172
#6  0x7d7596 in __potential_comm_MOD_electrodynamics_curv
	at /home/beef/code/gemini3d/src/numerical/potential/potential_comm_mumps.f90:221
#7  0x419b86 in gemini_main
	at /home/beef/code/gemini3d/src/libgemini.f90:397
#8  0x405fe4 in gemini3d_main
	at /home/beef/code/gemini3d/src/gemini_main.f90:55
#9  0x406300 in main
	at /home/beef/code/gemini3d/src/gemini_main.f90:21
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpiexec detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:

  Process name: [[30087,1],0]
  Exit code:    2
--------------------------------------------------------------------------
[beefy:682987] 31 more processes have sent help message help-mpi-btl-openib-cpc-base.txt / no cpcs for port
[beefy:682987] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
ERROR STOP gemini.bin run failure

Error termination. Backtrace:
#0  0x7fe8e59ab171 in ???
#1  0x7fe8e59abd19 in ???
#2  0x7fe8e59ace8e in ???
#3  0x446be1 in gemini3d_run
	at /home/beef/code/gemini3d/src/utils/gemini3d_run.f90:52
#4  0x446c34 in main
	at /home/beef/code/gemini3d/src/utils/gemini3d_run.f90:5
<end of output>
Test time =   3.07 sec
----------------------------------------------------------
Test Failed.
"run_bounds_check:GDI_periodic_lowres" end time: Oct 27 17:21 EDT
"run_bounds_check:GDI_periodic_lowres" time elapsed: 00:00:03
----------------------------------------------------------

Intel Windows unit tests

test_potential2d: segfaults without a traceback, even in Debug, on the elliptic2d_cart call.

However, ifort on Windows is capable of running the main 4 tests (2d/3d, fang/glow).

Full simulation tests need to clear out prior data

If you run ctest more than once, it will fail the second (and subsequent) times on the full simulation tests (e.g. test2dns_fang) because of the restart functionality, which tries to start from the last file it finds, i.e. the final simulation time. ctest needs to clear the output data before it runs, so that every ctest invocation starts the simulations from the beginning.

error in Fortran MPI image count check

Currently, numerical/grid/grid.f90:grid_check() does some basic sanity checks. However, a lot of bad cases can get through, causing non-obvious errors like NaNs in plasma variables or MUMPS failing to solve. Normally the Python front end will only give a "good" number of MPI images; however, if a user manually specifies the number of MPI images, grid_check() often doesn't catch the problem, and so we get NaN check errors or MUMPS memory errors.

Examples triggering errors

There are many more cases that will fail; this is just one example of each.

NaN in v1

python scripts/run_test.py 2dew_fang mpiexec.exe build/gemini.bin.exe build/test2dew_fang -np 3

(now fixed)

MUMPS error

python scripts/run_test.py 2dew_fang mpiexec.exe build/gemini.bin.exe build/test2dew_fang -np 7

Intermittent test failures due to reference data?

Intermittently, when setting up a "new" Gemini build directory, test2dew_fang fails with the error message below. Deleting the directory build/test2dew_fang fixes the error, because that forces test2dew_fang.zip to be re-extracted on the next test run. I haven't looked into how this failure occurs, or whether it was just old data on my PC.


5: Test command: \miniconda3\python.exe "code/gemini3d/scripts/run_test.py" "2dew_fang" "-mpiexec" "C:/Program Files/Microsoft MPI/Bin/mpiexec.exe" "/code/gemini3d/buildmpi/gemini.bin.exe" "/code/gemini3d/buildmpi/test2dew_fang" "-dryrun"
5: Test timeout computed to be: 60
5:            4 MPI processes detected
5: \code\gemini3d\buildmpi\gemini.bin.exe Process:       2 /      3 at 20200825T134121.236
5: \code\gemini3d\buildmpi\gemini.bin.exe Process:       0 /      3 at 20200825T134121.236
5: \code\gemini3d\buildmpi\gemini.bin.exe Process:       3 /      3 at 20200825T134121.236
5: \code\gemini3d\buildmpi\gemini.bin.exe Process:       1 /      3 at 20200825T134121.236
5:  ******************** input config ****************
5: simulation directory: \code\gemini3d\buildmpi\test2dew_fang
5:                             start year-month-day:    2013-02-20
5:                                       start time:   18000.000
5:                                         duration:     300.000
5:                                     output every:      60.000
5: gemini.f90: using input data files:
5: \code\gemini3d\buildmpi\test2dew_fang/inputs/simsize.h5
5: \code\gemini3d\buildmpi\test2dew_fang/inputs/simgrid.h5
5: \code\gemini3d\buildmpi\test2dew_fang/inputs/initial_conditions.h5
5:  no neutral disturbance specified.
5: Precipitation file input cadence (s):       5.000
5:  Precipitation file input source directory:  \code\gemini3d\buildmpi\test2dew_fang/inputs/precip/
5:  Electric field file input cadence (s):     1.0000000000000000
5:  Electric field file input source directory:  \code\gemini3d\buildmpi\test2dew_fang/inputs/Efield/
5:  GLOW disabled
5:  EIA disabled
5:  Variable background neutral atmosphere disabled.
5:  Background precipitation has total energy flux and energy:     1.0000000000000000E-003   3000.0000000000000
5:  Parallel current calculation enabled.
5:  Inertial capacitance calculation type:             0
5:  Diffusion solve type:             2
5:  Milestone output disabled.
5:  Gravitaional drift terms disabled.
5:  **************** end input config ***************
5: process grid (Number MPI processes) x2, x3:       1     4
5:  grid_size_root: full grid size:            48          40           1
5: Process:     1 at process grid location:     0     1
5: process grid (Number MPI processes) x2, x3:       1     4
5: process grid (Number MPI processes) x2, x3:       1     4
5: Process:     0 at process grid location:     0     0
5: Process:     2 at process grid location:     0     2
5:  2D run: **SWAP** x2 and x3 dims
5: Grid slab size:      48     1    10
5: process grid (Number MPI processes) x2, x3:       1     4
5: Process:     3 at process grid location:     0     3
5:  Starting grid input from file: \code\gemini3d\buildmpi\test2dew_fang/inputs/simgrid.h5
5:  2D grid: **PERMUTE** x2 and x3 dimensions
5:  Exchanging grid spacings...
5:  Computing subdomain spacing...
5:  Dealing with metric factors...
5:  Sending gravity, etc...
5:  Now sending unit vectors...
5:  Done sending slabbed variables to workers...
5:  Done computing null grid points...  Process:             3  has:             0
5:  Done computing null grid points...  Process:             2  has:             0
5:  Done computing null grid points...  Process:             1  has:             0
5:  Done computing null grid points...  Process:             0  has:             0
5:  Last milestone (if any)found in output directory:          2013           2          20   18300.000000000000      \code\gemini3d\buildmpi\test2dew_fang/20130220_18300.000000.h5
5:  !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
5:  ! Restarting simulation from time:          2013           2          20   18300.000000000000
5:  !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
5:  Treating the following file as initial conditions:  \code\gemini3d\buildmpi\test2dew_fang/20130220_18300.000000.h5
5:   full duration:     300.00000000000000      ; remaining simulation time:     0.0000000000000000
5: \code\gemini3d\buildmpi\test2dew_fang/inputs/simsize.h5 input dimensions:    48    40     1
5: Target (output) grid structure dimensions:    48     1    40
5:  2D simulation: **SWAP** x2/x3 dims and **PERMUTE** input arrays
5:  ERROR:h5fortran:shape_check: shape mismatch /nsall = 48 1 40 7   variable shape = 48 40 1 7
5: ERROR STOP
5:
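One way to avoid this class of stale-data failure would be a freshness check before extraction; a hypothetical sketch (the zip and directory locations are assumptions):

import shutil
import zipfile
from pathlib import Path

def ensure_ref_data(zip_path: Path, out_dir: Path) -> None:
    # Re-extract whenever the zip is newer than the extracted copy,
    # so stale reference data cannot linger in the build directory.
    if out_dir.is_dir() and out_dir.stat().st_mtime < zip_path.stat().st_mtime:
        shutil.rmtree(out_dir)
    if not out_dir.is_dir():
        with zipfile.ZipFile(zip_path) as z:
            z.extractall(out_dir)

ensure_ref_data(Path("test2dew_fang.zip"), Path("build/test2dew_fang"))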

run_bounds_check failures (GemCI)

First, build Gemini3D in a fresh build directory, on branch "ide_merge":

git checkout ide_merge
cmake -B build --preset debug
cmake --build build

To build with multiple compilers, use a separate build directory per compiler; we'll refer to these directories from GemCI.
Example: different build dir for Intel oneAPI:

# source Intel oneAPI setvars
cmake -B buildi --preset debug   # used "buildi"
cmake --build buildi

GemCI commands

The extended CI for Gemini3D is in repo GemCI.

You can use multiple build dirs for different compilers. Assuming GCC is in "gemini3d/build" and Intel oneAPI in "gemini3d/buildi":

(under gemci/)

cmake -B build --preset default -DGEMINI_ROOT=../gemini3d/build  # GCC
cmake -B buildi --preset default -DGEMINI_ROOT=../gemini3d/buildi  # intel

CentOS 8 Stream GCC

ctest --test-dir build -R run_bounds_check:tohoku20113D_lowres_axineu -V

This fails regardless of GCC version (8.5, 11, etc.), though it is OK on CentOS 8.4 in an Intel Mac VM. It fails the non-finite check on Ns or vs1 on all workers.
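For triage, it helps to scan an output frame for non-finite values directly; a hypothetical sketch using h5py (only the nsall dataset name appears in the logs on this page; vs1all and Tsall are assumptions, as is the frame name):

import h5py
import numpy as np

def report_nonfinite(frame: str, names=("nsall", "vs1all", "Tsall")) -> None:
    # Count non-finite entries in each plasma state variable present.
    with h5py.File(frame, "r") as f:
        for name in names:
            if name in f:
                bad = np.count_nonzero(~np.isfinite(f[name][...]))
                print(name, "has", bad, "non-finite values")

# frame name pattern follows the milestone files seen elsewhere on this page
report_nonfinite("20110311_35100.000000.h5")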

CentOS 8 Stream Intel oneAPI 2021.4

This is probably a better place to look for the source of the problem affecting GCC.

ctest --test-dir build -R run_bounds_check:tohoku20113D_lowres_axineu -V
test 149
    Start 149: run_bounds_check:tohoku20113D_lowres_3Dneu

149: Test command: /home/beef/code/gemini3d/buildi/gemini3d.run.debug "/mnt/raid/ci/tohoku20113D_lowres_3Dneu" "-mpiexec" "/home/beef/intel/oneapi/mpi/2021.4.0/bin/mpiexec" "-dryrun"
149: Environment variables:
149:  GEMINI_CIROOT=/mnt/raid/ci
149: Test timeout computed to be: 180
149: gemini3d.run: detected CPU count: 32
149: MPI partition of lx2, lx3: 128 48 is lid2, lid3: 4 8
149: MPI images: 32
149:  "/home/beef/intel/oneapi/mpi/2021.4.0/bin/mpiexec" -n 32 /home/beef/code/gemini
149:  3d/buildi/gemini.bin /mnt/raid/ci/tohoku20113D_lowres_3Dneu  -dryrun
149: /home/beef/code/gemini3d/buildi/gemini.bin Process:      11 /     31 at 20211109T133345.159
149:           32 MPI processes detected
149:  [equivalent startup lines from the remaining 31 processes elided]
149:  ******************** input config ****************
149: simulation directory: /mnt/raid/ci/tohoku20113D_lowres_3Dneu
149:                             start year-month-day:    2011-03-11
149:                                       start time:   35100.000
149:                                         duration:    1790.000
149:                                     output every:     125.000
149: gemini.f90: using input data files:
149: /mnt/raid/ci/tohoku20113D_lowres_3Dneu/inputs/simsize.h5
149: /mnt/raid/ci/tohoku20113D_lowres_3Dneu/inputs/simgrid.h5
149: /mnt/raid/ci/tohoku20113D_lowres_3Dneu/inputs/initial_conditions.h5
149:  Neutral disturbance mlat,mlon:     29.0667000000000
149:    209.850100000000
149:  Neutral disturbance cadence (s):     5.00000000000000
149:  Neutral grid resolution (m):     8333.30000000000        5000.00000000000
149:  Neutral disturbance data files located in directory:
149:  /mnt/raid/ci/acoustic3D_cartesian_neutrals
149:  no precipitation specified
149:  no Efield specified
149:  GLOW disabled
149:  MSISE00 enabled for neutral atmosphere calculations.
149:  EIA disabled
149:  Variable background neutral atmosphere disabled.
149:  Background precipitation has total energy flux and energy:
149:   1.000000000000000E-003   3000.00000000000
149:  Parallel current calculation enabled.
149:  Inertial capacitance calculation type:             0
149:  Diffusion solve type:             2
149:  Milestone output selected; cadence (every nth output) of:            10
149:  Gravitaional drift terms disabled.
149:  Lagrangian grid disabled
149:  **************** end input config ***************
149:  grid_size_root: full grid size:           512         128          48
149: process grid (Number MPI processes) x2, x3:  4 8
149: Process:0 at process grid location: 0 0
149:  get_subgrid_size: 3D run
149:  [the same three-line report from processes 1-31, at grid locations (0..3, 0..7), is elided]
149:   Detected dipole grid...
149: make_dipolemesh:  allocating space for grid of size: 516 36 10
149:  make_dipolemesh:  converting cell centers to spherical coordinates...
149:  make_dipolemesh:  converting cell interfaces in q...
149:  make_dipolemesh:  converting cell interfaces in p...
149:  make_dipolemesh:  metric factors for cell centers...
149:  make_dipolemesh:  geographic coordinates from magnetic...
149:  make_dipolemesh:  metric factors for cell q-interfaces...
149:  make_dipolemesh:  metric factors for cell p-intefaces...
149:  make_dipolemesh:  metric factors for cell phi-interfaces...
149:  make_dipolemesh:  spherical ECEF unit vectors...
149:  make_dipolemesh:  dipole unit vectors...
149:  make_dipolemesh:  magnetic fields...
149:  make_dipolemesh:  gravity...
149:  make_dipolemesh:  base type-bound procedure calls...
149:  make_dipolemesh:  inclination angle...
149:  [each message above is printed once by each of the 32 MPI processes; the interleaved repeats are elided]
149: /mnt/raid/ci/tohoku20113D_lowres_3Dneu/inputs/simsize.h5 input dimensions:   512   128    48
149: Target (output) grid structure dimensions:   512   128    48
149: Sent ICs to workers in   1.715E-01 seconds.
149:
149: Initial conditions (root):
149: ------------------------
149: Min/max input density:   0.00E+00   1.93E+12
149: Min/max input velocity:  -2.03E+02   2.09E+02
149: Min/max input temperature:   0.00E+00   2.31E+03
149: Min/max input electric potential:   0.00E+00   0.00E+00
149: Min/max input electric potential (full grid):   0.00E+00   0.00E+00
149:  Priming electric field input
149:  Priming precipitation input
149:  Computing background and priming neutral perturbation input (if used)
149:  Initial neutral density and temperature (from MSIS) at time:          2011
149:            3          11   35100.0000000000       calculated in time:
149:   0.312159000000000
149:  Initial neutral winds (from HWM) at time:          2011           3          11
149:    35100.0000000000       calculated in time:    3.173400000000015E-002
149:  Computing alt,radial distance values for plasma grid and completing rotations
149:  ...Packing interpolation target points...
149:  ...Clearing out unit vectors (after projections)...
149:  Interpolation coords:    -263221.230730986        1900909.89335058
149:   -296886.936552027       -195577.868034098       -6571596.02207079
149:    107456.873471994
149:  Projection checking:   -1.110223024625157E-016  1.665334536937735E-016
149:  -1.110223024625157E-016  1.110223024625157E-016   1.00000000000000
149:    1.00000000000000
149: READ neutral size from:
149: /mnt/raid/ci/acoustic3D_cartesian_neutrals
149:  Neutral data has lx,ly,lz size:            80          90         100
149:   with spacing dx,dy,dz   9375.00000000000        8333.30000000000
149:    5000.00000000000
149:  ...creating vertical grid and sending to workers...
149:  Created full neutral grid with y,z extent:  -370312.500000000
149:    370312.500000000       -370831.850000000        370831.850000000
149:   0.000000000000000E+000   495000.000000000
149:  Receiving xn and yn ranges from workers...
149:  Subgrid extents:             1  0.000000000000000E+000   495000.000000000
149:   -296886.876918814       -193648.417092116       -522251.078754466
149:    214613.259946849
149:  Subgrid extents:             2  0.000000000000000E+000   495000.000000000
149:   -296886.810754322       -191743.575938421       -289155.953293666
149:   -3219366.75091746
149:  Subgrid extents:             3  0.000000000000000E+000   495000.000000000
149:   -296886.737598645       -189866.395390837       -92634.6175271534
149:   -3218529.37116980
149:  Subgrid extents:             4  0.000000000000000E+000   495000.000000000
149:   -211717.176038069       -121845.141358428       -787209.813083884
149:    107456.873471994
149:  Subgrid extents:             5  0.000000000000000E+000   495000.000000000
149:   -211717.133515980       -120643.149053914       -522251.078754466
149:    214613.259946849
149:  Subgrid extents:             6  0.000000000000000E+000   495000.000000000
149:   -211717.086336700       -119456.486277357       -289155.953293666
149:   -3219366.75091746
149:  Subgrid extents:             7  0.000000000000000E+000   495000.000000000
149:   -211717.034172271       -118287.053686738       -92634.6175271534
149:   -3218529.37116980
149:  Subgrid extents:             8  0.000000000000000E+000   495000.000000000
149:   -126545.896536252       -48109.7357850735       -787209.813083884
149:    107456.873471994
149:  Subgrid extents:             9  0.000000000000000E+000   495000.000000000
149:   -126545.871121792       -47635.1492264586       -522251.078754466
149:    214613.259946849
149:  Subgrid extents:            10  0.000000000000000E+000   495000.000000000
149:   -126545.842923865       -47166.6149248992       -289155.953293666
149:   -3219366.75091746
149:  Subgrid extents:            11  0.000000000000000E+000   495000.000000000
149:   -126545.811746387       -46704.8833410200       -92634.6175271534
149:   -3218529.37116980
149:  Subgrid extents:            12  0.000000000000000E+000   495000.000000000
149:   -41376.1359822077        29598.9176439113       -787209.813083884
149:    107456.873471994
149:  Subgrid extents:            13  0.000000000000000E+000   495000.000000000
149:   -41376.1276728206        29598.9116996979       -522251.078754466
149:    214613.259946849
149:  Subgrid extents:            14  0.000000000000000E+000   495000.000000000
149:   -41376.1184534658        29598.9051046234       -289155.953293666
149:   -3219366.75091746
149:  Subgrid extents:            15  0.000000000000000E+000   495000.000000000
149:   -41376.1082595573        29598.8978124041       -92634.6175271534
149:   -3218529.37116980
149:  Subgrid extents:            16  0.000000000000000E+000   495000.000000000
149:    37913.7733200904        114770.197162192       -787209.813083884
149:    107456.873471994
149:  Subgrid extents:            17  0.000000000000000E+000   495000.000000000
149:    37539.7672070096        114770.174112802       -522251.078754466
149:    214613.259946849
149:  Subgrid extents:            18  0.000000000000000E+000   495000.000000000
149:    37170.5306599960        114770.148538978       -289155.953293666
149:   -3219366.75091746
149:  Subgrid extents:            19  0.000000000000000E+000   495000.000000000
149:    36806.6550935514        114770.120262840       -92634.6175271534
149:   -3218529.37116980
149:  Subgrid extents:            20  0.000000000000000E+000   495000.000000000
149:    111649.302526122        199939.957703927       -787209.813083884
149:    107456.873471994
149:  Subgrid extents:            21  0.000000000000000E+000   495000.000000000
149:    110547.896637764        199939.917547599       -522251.078754466
149:    214613.259946849
149:  Subgrid extents:            22  0.000000000000000E+000   495000.000000000
149:    109460.537300580        199939.872993247       -289155.953293666
149:   -3219366.75091746
149:  Subgrid extents:            23  0.000000000000000E+000   495000.000000000
149:    108388.966127579        199939.823731047       -92634.6175271534
149:   -3218529.37116980
149:  Subgrid extents:            24  0.000000000000000E+000   495000.000000000
149:    185382.266985223        285111.237184720       -787209.813083884
149:    107456.873471994
149:  Subgrid extents:            25  0.000000000000000E+000   495000.000000000
149:    183553.413941987        285111.179917592       -522251.078754466
149:    214613.259946849
149:  Subgrid extents:            26  0.000000000000000E+000   495000.000000000
149:    181747.887160226        285111.116378336       -289155.953293666
149:   -3219366.75091746
149:  Subgrid extents:            27  0.000000000000000E+000   495000.000000000
149:    179968.578415731        285111.046125288       -92634.6175271534
149:   -3218529.37116980
149:  Subgrid extents:            28  0.000000000000000E+000   495000.000000000
149:    259114.470961689        370280.997668944       -787209.813083884
149:    107456.873471994
149:  Subgrid extents:            29  0.000000000000000E+000   495000.000000000
149:    256558.057569577        370280.923286192       -522251.078754466
149:    214613.259946849
149:  Subgrid extents:            30  0.000000000000000E+000   495000.000000000
149:    254034.255115698        370280.840756738       -289155.953293666
149:   -3219366.75091746
149:  Subgrid extents:            31  0.000000000000000E+000   495000.000000000
149:    251547.105534201        370280.749506894       -92634.6175271534
149:   -3218529.37116980
149:  Root grid check:    -370831.850000000        370831.850000000
149:  Converting ranges to indices...
149:  !!!!!!!!!!!!!!!!!
149:            0
149:   0.000000000000000E+000   495000.000000000       -296886.936552027
149:   -195577.868034121       -787209.813083884        107456.873471994
149:            1         100           8          20           1          59
149:           80          90
149:   -304687.500000000       -192187.500000000
149:   -370831.850000000        112499.550000000
149:  !!!!!!!!!!!!!!!!!
149:  Subgrid indices           0           1         100           8          20
149:            1          59
149:  !!!!!!!!!!!!!!!!!
149:            0
149:   0.000000000000000E+000   495000.000000000       -296886.876918814
149:   -193648.417092116       -522251.078754466        214613.259946849
149:            1         100           8          20           1          72
149:           80          90
149:   -304687.500000000       -192187.500000000
149:   -370831.850000000        220832.450000000
149:  !!!!!!!!!!!!!!!!!
149:  Subgrid indices           1           1         100           8          20
149:            1          72
149:  !!!!!!!!!!!!!!!!!
149:            0
149:   0.000000000000000E+000   495000.000000000       -296886.810754322
149:   -191743.575938421       -289155.953293666       -3219366.75091746
149:            1         100           8          21          10          11
149:           80          90
149:   -304687.500000000       -182812.500000000
149:   -295832.150000000       -287498.850000000
149:  !!!!!!!!!!!!!!!!!
149:  Subgrid indices           2           1         100           8          21
149:           10          11
149:  !!!!!!!!!!!!!!!!!
149:            0
149:   0.000000000000000E+000   495000.000000000       -296886.737598645
149:   -189866.395390837       -92634.6175271534       -3218529.37116980
149:            1         100           8          21          34          35
149:           80          90
149:   -304687.500000000       -182812.500000000
149:   -95832.9500000000       -87499.6500000000
149:  !!!!!!!!!!!!!!!!!
149:  Subgrid indices           3           1         100           8          21
149:           34          35
149:  !!!!!!!!!!!!!!!!!
149:            0
149:   0.000000000000000E+000   495000.000000000       -211717.176038069
149:   -121845.141358428       -787209.813083884        107456.873471994
149:            1         100          17          28           1          59
149:           80          90
149:   -220312.500000000       -117187.500000000
149:   -370831.850000000        112499.550000000
149:  !!!!!!!!!!!!!!!!!
149:  Subgrid indices           4           1         100          17          28
149:            1          59
149:  !!!!!!!!!!!!!!!!!
149:            0
149:   0.000000000000000E+000   495000.000000000       -211717.133515980
149:   -120643.149053914       -522251.078754466        214613.259946849
149:            1         100          17          28           1          72
149:           80          90
149:   -220312.500000000       -117187.500000000
149:   -370831.850000000        220832.450000000
149:  !!!!!!!!!!!!!!!!!
149:  Subgrid indices           5           1         100          17          28
149:            1          72
149:  !!!!!!!!!!!!!!!!!
149:            0
149:   0.000000000000000E+000   495000.000000000       -211717.086336700
149:   -119456.486277357       -289155.953293666       -3219366.75091746
149:            1         100          17          28          10          11
149:           80          90
149:   -220312.500000000       -117187.500000000
149:   -295832.150000000       -287498.850000000
149:  !!!!!!!!!!!!!!!!!
149:  Subgrid indices           6           1         100          17          28
149:           10          11
149:  !!!!!!!!!!!!!!!!!
149:            0
149:   0.000000000000000E+000   495000.000000000       -211717.034172271
149:   -118287.053686738       -92634.6175271534       -3218529.37116980
149:            1         100          17          28          34          35
149:           80          90
149:   -220312.500000000       -117187.500000000
149:   -95832.9500000000       -87499.6500000000
149:  !!!!!!!!!!!!!!!!!
149:  Subgrid indices           7           1         100          17          28
149:           34          35
149:  !!!!!!!!!!!!!!!!!
149:            0
149:   0.000000000000000E+000   495000.000000000       -126545.896536252
149:   -48109.7357850735       -787209.813083884        107456.873471994
149:            1         100          27          36           1          59
149:           80          90
149:   -126562.500000000       -42187.5000000000
149:   -370831.850000000        112499.550000000
149:  !!!!!!!!!!!!!!!!!
149:  Subgrid indices           8           1         100          27          36
149:            1          59
149:  !!!!!!!!!!!!!!!!!
149:            0
149:   0.000000000000000E+000   495000.000000000       -126545.871121792
149:   -47635.1492264586       -522251.078754466        214613.259946849
149:            1         100          27          36           1          72
149:           80          90
149:   -126562.500000000       -42187.5000000000
149:   -370831.850000000        220832.450000000
149:  !!!!!!!!!!!!!!!!!
149:  Subgrid indices           9           1         100          27          36
149:            1          72
149:  !!!!!!!!!!!!!!!!!
149:            0
149:   0.000000000000000E+000   495000.000000000       -126545.842923865
149:   -47166.6149248992       -289155.953293666       -3219366.75091746
149:            1         100          27          36          10          11
149:           80          90
149:   -126562.500000000       -42187.5000000000
149:   -295832.150000000       -287498.850000000
149:  !!!!!!!!!!!!!!!!!
149:  Subgrid indices          10           1         100          27          36
149:           10          11
149:  !!!!!!!!!!!!!!!!!
149:            0
149:   0.000000000000000E+000   495000.000000000       -126545.811746387
149:   -46704.8833410200       -92634.6175271534       -3218529.37116980
149:            1         100          27          36          34          35
149:           80          90
149:   -126562.500000000       -42187.5000000000
149:   -95832.9500000000       -87499.6500000000
149:  !!!!!!!!!!!!!!!!!
149:  Subgrid indices          11           1         100          27          36
149:           34          35
149:  !!!!!!!!!!!!!!!!!
149:            0
149:   0.000000000000000E+000   495000.000000000       -41376.1359822077
149:    29598.9176439113       -787209.813083884        107456.873471994
149:            1         100          36          44           1          59
149:           80          90
149:   -42187.5000000000        32812.5000000000
149:   -370831.850000000        112499.550000000
149:  !!!!!!!!!!!!!!!!!
149:  Subgrid indices          12           1         100          36          44
149:            1          59
149:  !!!!!!!!!!!!!!!!!
149:            0
149:   0.000000000000000E+000   495000.000000000       -41376.1276728206
149:    29598.9116996979       -522251.078754466        214613.259946849
149:            1         100          36          44           1          72
149:           80          90
149:   -42187.5000000000        32812.5000000000
149:   -370831.850000000        220832.450000000
149:  !!!!!!!!!!!!!!!!!
149:  Subgrid indices          13           1         100          36          44
149:            1          72
149:  !!!!!!!!!!!!!!!!!
149:            0
149:   0.000000000000000E+000   495000.000000000       -41376.1184534658
149:    29598.9051046234       -289155.953293666       -3219366.75091746
149:            1         100          36          44          10          11
149:           80          90
149:   -42187.5000000000        32812.5000000000
149:   -295832.150000000       -287498.850000000
149:  !!!!!!!!!!!!!!!!!
149:  Subgrid indices          14           1         100          36          44
149:           10          11
149:  !!!!!!!!!!!!!!!!!
149:            0
149:   0.000000000000000E+000   495000.000000000       -41376.1082595573
149:    29598.8978124041       -92634.6175271534       -3218529.37116980
149:            1         100          36          44          34          35
149:           80          90
149:   -42187.5000000000        32812.5000000000
149:   -95832.9500000000       -87499.6500000000
149:  !!!!!!!!!!!!!!!!!
149:  Subgrid indices          15           1         100          36          44
149:           34          35
149:  !!!!!!!!!!!!!!!!!
149:            0
149:   0.000000000000000E+000   495000.000000000        37913.7733200904
149:    114770.197162192       -787209.813083884        107456.873471994
149:            1         100          44          53           1          59
149:           80          90
149:    32812.5000000000        117187.500000000
149:   -370831.850000000        112499.550000000
149:  !!!!!!!!!!!!!!!!!
149:  Subgrid indices          16           1         100          44          53
149:            1          59
149:  !!!!!!!!!!!!!!!!!
149:            0
149:   0.000000000000000E+000   495000.000000000        37539.7672070096
149:    114770.174112802       -522251.078754466        214613.259946849
149:            1         100          44          53           1          72
149:           80          90
149:    32812.5000000000        117187.500000000
149:   -370831.850000000        220832.450000000
149:  !!!!!!!!!!!!!!!!!
149:  Subgrid indices          17           1         100          44          53
149:            1          72
149:  !!!!!!!!!!!!!!!!!
149:            0
149:   0.000000000000000E+000   495000.000000000        37170.5306599960
149:    114770.148538978       -289155.953293666       -3219366.75091746
149:            1         100          44          53          10          11
149:           80          90
149:    32812.5000000000        117187.500000000
149:   -295832.150000000       -287498.850000000
149:  !!!!!!!!!!!!!!!!!
149:  Subgrid indices          18           1         100          44          53
149:           10          11
149:  !!!!!!!!!!!!!!!!!
149:            0
149:   0.000000000000000E+000   495000.000000000        36806.6550935514
149:    114770.120262840       -92634.6175271534       -3218529.37116980
149:            1         100          44          53          34          35
149:           80          90
149:    32812.5000000000        117187.500000000
149:   -95832.9500000000       -87499.6500000000
149:  !!!!!!!!!!!!!!!!!
149:  Subgrid indices          19           1         100          44          53
149:           34          35
149:  !!!!!!!!!!!!!!!!!
149:            0
149:   0.000000000000000E+000   495000.000000000        111649.302526122
149:    199939.957703927       -787209.813083884        107456.873471994
149:            1         100          52          62           1          59
149:           80          90
149:    107812.500000000        201562.500000000
149:   -370831.850000000        112499.550000000
149:  !!!!!!!!!!!!!!!!!
149:  Subgrid indices          20           1         100          52          62
149:            1          59
149:  !!!!!!!!!!!!!!!!!
149:            0
149:   0.000000000000000E+000   495000.000000000        110547.896637764
149:    199939.917547599       -522251.078754466        214613.259946849
149:            1         100          52          62           1          72
149:           80          90
149:    107812.500000000        201562.500000000
149:   -370831.850000000        220832.450000000
149:  !!!!!!!!!!!!!!!!!
149:  Subgrid indices          21           1         100          52          62
149:            1          72
149:  !!!!!!!!!!!!!!!!!
149:            0
149:   0.000000000000000E+000   495000.000000000        109460.537300580
149:    199939.872993247       -289155.953293666       -3219366.75091746
149:            1         100          52          62          10          11
149:           80          90
149:    107812.500000000        201562.500000000
149:   -295832.150000000       -287498.850000000
149:  !!!!!!!!!!!!!!!!!
149:  Subgrid indices          22           1         100          52          62
149:           10          11
149:  !!!!!!!!!!!!!!!!!
149:            0
149:   0.000000000000000E+000   495000.000000000        108388.966127579
149:    199939.823731047       -92634.6175271534       -3218529.37116980
149:            1         100          52          62          34          35
149:           80          90
149:    107812.500000000        201562.500000000
149:   -95832.9500000000       -87499.6500000000
149:  !!!!!!!!!!!!!!!!!
149:  Subgrid indices          23           1         100          52          62
149:           34          35
149:  !!!!!!!!!!!!!!!!!
149:            0
149:   0.000000000000000E+000   495000.000000000        185382.266985223
149:    285111.237184720       -787209.813083884        107456.873471994
149:            1         100          60          71           1          59
149:           80          90
149:    182812.500000000        285937.500000000
149:   -370831.850000000        112499.550000000
149:  !!!!!!!!!!!!!!!!!
149:  Subgrid indices          24           1         100          60          71
149:            1          59
149:  !!!!!!!!!!!!!!!!!
149:            0
149:   0.000000000000000E+000   495000.000000000        183553.413941987
149:    285111.179917592       -522251.078754466        214613.259946849
149:            1         100          60          71           1          72
149:           80          90
149:    182812.500000000        285937.500000000
149:   -370831.850000000        220832.450000000
149:  !!!!!!!!!!!!!!!!!
149:  Subgrid indices          25           1         100          60          71
149:            1          72
149:  !!!!!!!!!!!!!!!!!
149:            0
149:   0.000000000000000E+000   495000.000000000        181747.887160226
149:    285111.116378336       -289155.953293666       -3219366.75091746
149:            1         100          59          71          10          11
149:           80          90
149:    173437.500000000        285937.500000000
149:   -295832.150000000       -287498.850000000
149:  !!!!!!!!!!!!!!!!!
149:  Subgrid indices          26           1         100          59          71
149:           10          11
149:  !!!!!!!!!!!!!!!!!
149:            0
149:   0.000000000000000E+000   495000.000000000        179968.578415731
149:    285111.046125288       -92634.6175271534       -3218529.37116980
149:            1         100          59          71          34          35
149:           80          90
149:    173437.500000000        285937.500000000
149:   -95832.9500000000       -87499.6500000000
149:  !!!!!!!!!!!!!!!!!
149:  Subgrid indices          27           1         100          59          71
149:           34          35
149:  !!!!!!!!!!!!!!!!!
149:            0
149:   0.000000000000000E+000   495000.000000000        259114.470961689
149:    370280.997668944       -787209.813083884        107456.873471994
149:            1         100          68          80           1          59
149:           80          90
149:    257812.500000000        370312.500000000
149:   -370831.850000000        112499.550000000
149:  !!!!!!!!!!!!!!!!!
149:  Subgrid indices          28           1         100          68          80
149:            1          59
149:  !!!!!!!!!!!!!!!!!
149:            0
149:   0.000000000000000E+000   495000.000000000        256558.057569577
149:    370280.923286192       -522251.078754466        214613.259946849
149:            1         100          67          80           1          72
149:           80          90
149:    248437.500000000        370312.500000000
149:   -370831.850000000        220832.450000000
149:  !!!!!!!!!!!!!!!!!
149:  Subgrid indices          29           1         100          67          80
149:            1          72
149:  !!!!!!!!!!!!!!!!!
149:            0
149:   0.000000000000000E+000   495000.000000000        254034.255115698
149:    370280.840756738       -289155.953293666       -3219366.75091746
149:            1         100          67          80          10          11
149:           80          90
149:    248437.500000000        370312.500000000
149:   -295832.150000000       -287498.850000000
149:  !!!!!!!!!!!!!!!!!
149:  Subgrid indices          30           1         100          67          80
149:           10          11
149:  !!!!!!!!!!!!!!!!!
149:            0
149:   0.000000000000000E+000   495000.000000000        251547.105534201
149:    370280.749506894       -92634.6175271534       -3218529.37116980
149:            1         100          67          80          34          35
149:           80          90
149:    248437.500000000        370312.500000000
149:   -95832.9500000000       -87499.6500000000
149:  !!!!!!!!!!!!!!!!!
149:  Subgrid indices          31           1         100          67          80
149:           34          35
149:  Sending sizes and xn,yn subgrids to workers...
149:  Root is picking out its own subgrid...
149:    Priming dataset:  neutral perturbations (3D)
<snip: the "Priming dataset" line repeats 31 more times>
149: forrtl: severe (408): fort: (3): Subscript #1 of the array X3 has value 0 which is less than the lower bound of 1
149:
149: Image              PC                Routine            Line        Source
149: gemini.bin         0000000000F7764F  Unknown               Unknown  Unknown
149: gemini.bin         00000000004AAA34  interpolation_mp_         171  interpolation.f90
149: gemini.bin         00000000004DA6D5  inputdataobj_mp_s         589  inputdataobj.f90
149: gemini.bin         00000000004D1570  inputdataobj_mp_u         420  inputdataobj.f90
149: gemini.bin         000000000051A299  neutraldata3dobj_         621  neutraldata3Dobj.f90
149: gemini.bin         00000000004D135B  inputdataobj_mp_p         347  inputdataobj.f90
149: gemini.bin         0000000000528CC8  neutraldata3dobj_         160  neutraldata3Dobj.f90
149: gemini.bin         0000000000602F0B  neutral_mp_init_n         125  neutral.f90
149: gemini.bin         0000000000803B29  gemini_main               316  libgemini.f90
149: gemini.bin         000000000040781C  MAIN__                     55  gemini_main.f90
149: gemini.bin         0000000000407762  Unknown               Unknown  Unknown
149: libc-2.28.so       00007F1BD4936493  __libc_start_main     Unknown  Unknown
149: gemini.bin         000000000040766E  Unknown               Unknown  Unknown
<snip: 15 more identical error messages and tracebacks, one per remaining MPI process>
149:
149: gemini.bin run failure
6/6 Test #149: run_bounds_check:tohoku20113D_lowres_3Dneu .............***Failed    2.92 sec
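
Both this failure and the Tohoku bounds-check log below trip the same invariant: a computed 1-based subscript into the neutral grid evaluates to 0 where the lower bound is 1. As a point of reference only, here is a minimal sketch of the kind of clamp that restores that invariant (Python for brevity; the real fix belongs in the Fortran slab-range logic, and the function name is hypothetical):

def clamp_index(i: int, n: int) -> int:
    """Clamp a 1-based (Fortran-style) subscript into [1, n]."""
    return max(1, min(i, n))

assert clamp_index(0, 90) == 1    # the failing case: subscript 0 against lower bound 1
assert clamp_index(95, 90) == 90  # symmetric guard on the upper bound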

Occasional build error with h5fortran

I'm not sure exactly how to reproduce this, but when it does happen it recurs over and over, at least on Ubuntu 20.04.

FAILED: src/numerical/grid/CMakeFiles/grid.dir/readgrid_hdf5.f90.o include/grid@readgrid_hdf5.smod
/usr/bin/gfortran -Igemini3d/src/numerical/grid -Igemini3d/build/include -I/usr/include/hdf5/serial -I/usr/include -I/usr/lib/x86_64-linux-gnu/openmpi/include -I/usr/lib/x86_64-linux-gnu/openmpi/lib -fimplicit-none -Wno-unused-dummy-argument -Wno-unused-variable -Wno-unused-function -fno-backtrace -Wno-do-subscript -Wno-maybe-uninitialized -O3 -DNDEBUG -O3 -Jinclude -mtune=native -pthread -fpreprocessed -c src/numerical/grid/CMakeFiles/grid.dir/readgrid_hdf5.f90-pp.f90 -o src/numerical/grid/CMakeFiles/grid.dir/readgrid_hdf5.f90.o
gemini3d/src/numerical/grid/readgrid_hdf5.f90:4:4:

    4 | use h5fortran, only: hdf5_file
      |    1
Fatal Error: Mismatch in components of derived type '__vtype_h5fortran_Hdf5_file' from 'h5fortran' at (1): expecting 'close', but got 'chunks'
compilation terminated.
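
This gfortran message generally means the compiler found an h5fortran module file whose derived-type layout differs from what the build expects, e.g. stale *.mod/*.smod files left over from a prior h5fortran version, or a system copy picked up via the /usr/include search paths on the compile line above. A blunt workaround sketch, assuming a standard CMake build tree (paths illustrative):

import shutil
import subprocess
from pathlib import Path

build = Path("gemini3d/build")    # illustrative build directory
if build.is_dir():
    shutil.rmtree(build)          # discard stale CMake cache and Fortran module files
subprocess.run(["cmake", "-S", "gemini3d", "-B", str(build)], check=True)
subprocess.run(["cmake", "--build", str(build)], check=True)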

Make test data auto-update when a new test data version is uploaded

Currently, if new test data is uploaded to Zenodo, users who already downloaded a prior version of the data will probably get failing ctest results for simulations, because the test data does not auto-update.

A useful UX enhancement would be to do more than just check that a folder exists under gemini3d/tests/data/test*. We could add a version file, or an NML or HDF5 variable, that Python checks before deciding whether to download new data. This would require regenerating the existing test data to include this new variable.

An alternate approach that might give backward compatibility is to check the date of a file in tests/data/tests*. I think this would be fragile, since some .zip extractors or file systems might not preserve the date. For a little more development time, it's better to store an explicit value -- I think a semantic data version number would be appropriate.
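
A sketch of the version-marker idea (names are hypothetical; the marker file, its location, and where the pinned version lives would all need design): Python compares a version value stored inside the extracted test data against the version the repo expects, and triggers a fresh download on mismatch.

from pathlib import Path

PINNED_DATA_VERSION = "1.0.0"     # hypothetical: pinned in the repo

def test_data_stale(data_dir: Path) -> bool:
    marker = data_dir / "data_version.txt"    # hypothetical marker file
    if not marker.is_file():      # archives predating the marker: treat as stale
        return True
    return marker.read_text().strip() != PINNED_DATA_VERSION

if test_data_stale(Path("tests/data/test3d")):    # illustrative data folder
    print("test data missing or outdated: re-download from Zenodo and re-extract")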

Bounds Check: Tohoku 3D tohoku20113D_lowres_3Dneu

error log:

135/148 Testing: run_bounds_check:tohoku20113D_lowres_3Dneu
135/148 Test: run_bounds_check:tohoku20113D_lowres_3Dneu
Command: "/home/beef/code/gemini3d/build/Debug/gemini3d.run.debug" "/mnt/raid/ci/tohoku20113D_lowres_3Dneu" "-mpiexec" "/usr/lib64/openmpi/bin/mpiexec" "-dryrun"
Directory: /home/beef/code/gemini3d/build/Release
"run_bounds_check:tohoku20113D_lowres_3Dneu" start time: Oct 27 17:28 EDT
Output:
----------------------------------------------------------
gemini3d.run: detected CPU count: 32
MPI partition of lx2, lx3: 128 48 is lid2, lid3: 4 8
MPI images: 32

<snip>

/mnt/raid/ci/tohoku20113D_lowres_3Dneu/inputs/simsize.h5 input dimensions:   512   128    48
Target (output) grid structure dimensions:   512   128    48
Sent ICs to workers in   2.396E-01 seconds.

Initial conditions (root):
------------------------
Min/max input density:   0.00E+00   1.93E+12
Min/max input velocity:  -2.03E+02   2.09E+02
Min/max input temperature:   0.00E+00   2.31E+03
Min/max input electric potential:   0.00E+00   0.00E+00
Min/max input electric potential (full grid):   0.00E+00   0.00E+00
 Priming electric field input
 Priming precipitation input
 Computing background and priming neutral perturbation input (if used)
 Initial neutral density and temperature (from MSIS) at time:          2011           3          11   35100.000000000000       calculated in time:    0.47727200000000014
 Initial neutral winds (from HWM) at time:          2011           3          11   35100.000000000000       calculated in time:     3.9384000000000086E-002
 !!!Attempting initial load of neutral dynamics files!!! This is a workaround to insure compatibility with restarts...        2011           3          11   35100.000000000000
 Computing alt,radial distance values for plasma grid and completing rotations
 ...Clearing out unit vectors (after projections)...
 Projection checking:    -1.1102230246251565E-016   1.6653345369377348E-016  -1.1102230246251565E-016   1.1102230246251565E-016  0.99999999999999989        1.0000000000000000
READ neutral size from:
/mnt/raid/ci/acoustic3D_cartesian_neutrals
 Neutral data has lx,ly,lz size:            80          90         100  with spacing dx,dy,dz   9375.0000000000000        8333.2999999999993        5000.0000000000000
 ...creating vertical grid and sending to workers...
 Created full neutral grid with y,z extent:  -370312.50000000000        370312.50000000000       -370831.84999999998        370831.84999999998        0.0000000000000000        495000.00000000000
At line 615 of file /home/beef/code/gemini3d/src/neutral/proj.f90
Fortran runtime error: Index '0' of dimension 1 of array 'yitmp' below lower bound of 1

Error termination. Backtrace:
<snip: the identical runtime error repeats for each of the other MPI processes>
 Receiving xn and yn ranges from workers...
 Subgrid extents:             1   0.0000000000000000        495000.00000000000       -296886.87691879936       -193648.41709209242       -522251.07875446643        214613.25994687024
At line 615 of file /home/beef/code/gemini3d/src/neutral/proj.f90
Fortran runtime error: Index '0' of dimension 1 of array 'yitmp' below lower bound of 1

Error termination. Backtrace:
#0  0x7fbd1360d171 in ???
#1  0x7fbd1360dd19 in ???
#2  0x7fbd1360e0fb in ???
#3  0x5683f6 in __neutral.proj_MOD_slabrange
	at /home/beef/code/gemini3d/src/neutral/proj.f90:615
#4  0x5791b3 in __neutral.perturb_MOD_gridproj_dneu3d
	at /home/beef/code/gemini3d/src/neutral/proj.f90:509
#5  0x55b3ea in __neutral.perturb_MOD_neutral_perturb_3d
	at /home/beef/code/gemini3d/src/neutral/perturb.f90:264
#6  0x55f842 in __neutral_MOD_neutral_perturb
	at /home/beef/code/gemini3d/src/neutral/perturb.f90:63
#7  0x53c7bf in __neutral_MOD_init_neutrals
	at /home/beef/code/gemini3d/src/neutral/neutral.f90:183
#8  0x412369 in gemini_main
	at /home/beef/code/gemini3d/src/libgemini.f90:312
#9  0x405fe4 in gemini3d_main
	at /home/beef/code/gemini3d/src/gemini_main.f90:55
#10  0x406300 in main
	at /home/beef/code/gemini3d/src/gemini_main.f90:21
<snip: the same traceback from the remaining MPI processes is interleaved in the raw output>
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpiexec detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:

  Process name: [[29154,1],2]
  Exit code:    2
--------------------------------------------------------------------------
[beefy:683918] 31 more processes have sent help message help-mpi-btl-openib-cpc-base.txt / no cpcs for port
[beefy:683918] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
ERROR STOP gemini.bin run failure

Error termination. Backtrace:
#0  0x7f40ce4a9171 in ???
#1  0x7f40ce4a9d19 in ???
#2  0x7f40ce4aae8e in ???
#3  0x446be1 in gemini3d_run
	at /home/beef/code/gemini3d/src/utils/gemini3d_run.f90:52
#4  0x446c34 in main
	at /home/beef/code/gemini3d/src/utils/gemini3d_run.f90:5
<end of output>
Test time =   5.39 sec
----------------------------------------------------------
Test Failed.
"run_bounds_check:tohoku20113D_lowres_3Dneu" end time: Oct 27 17:28 EDT
"run_bounds_check:tohoku20113D_lowres_3Dneu" time elapsed: 00:00:05
----------------------------------------------------------

MPI grid error checking

The code checks that the grid size is conformable with the MPI process-grid split in x2 and x3, but this check runs before the final MPI process gridding is complete. The error checking in grid_size likely needs to move to some place after mpi_grid() or mpi_manualgrid() has been called.
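A minimal sketch of the proposed reordering follows. This is hedged: lx2all/lx3all (global grid extents) and lid2/lid3 (process counts per dimension) are assumed names following common GEMINI conventions, and the mpi_grid() / mpi_manualgrid() argument lists are not the actual interfaces.

  ! hypothetical ordering: finalize the MPI process grid first...
  call mpi_grid(lx2all, lx3all)    ! or mpi_manualgrid(...)
  ! ...then verify conformability, instead of checking inside grid_size
  if (modulo(lx2all, lid2) /= 0 .or. modulo(lx3all, lid3) /= 0) &
    error stop 'grid size not conformable with MPI x2/x3 process grid split'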

Periodic potential solves fail with gcc bounds checking

 Root has communicated type of solve to workers:             0
 Workers has computed background field currents...
 Workers has computed wind currents...
 Beginning field-integrated solve...
 Using FAC boundary condition...
 Root is calling MUMPS...
 !!!User selected periodic solve...
At line 257 of file /Users/zettergm/Projects/gemini3d/build/src/numerical/calculus/calculus.f90
Fortran runtime error: Dimension 2 of array 'grad2d1_curv_alt_23' has extent 128 instead of 129

Error termination. Backtrace:

Could not print backtrace: executable file is not an executable
#0  0x10386fd3e
#1  0x1038709e5
#2  0x103870fb6
#3  0x1022cf270
#4  0x10217030b
#5  0x1020d425d
#6  0x102098175
#7  0x101f9b0bc
#8  0x101f9cabf
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:

  Process name: [[8398,1],0]
  Exit code:    2
--------------------------------------------------------------------------
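For reference, here is a minimal sketch of this failure class (illustrative names and sizes only, not the actual calculus.f90 code). With gfortran's -fcheck=bounds, an extent mismatch between arrays whose shapes are only known at run time aborts with a message of this kind; the exact wording varies with the gfortran version and context.

  program bounds_demo
  implicit none
  real :: a(4,129), b(4,128)
  b = 0.
  call copy2(a, b)
  contains
  subroutine copy2(x, y)
    real, intent(out) :: x(:,:)
    real, intent(in)  :: y(:,:)
    x = y   ! dimension 2 mismatch (129 vs 128) is only caught at run time
  end subroutine
  end program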

EIA bug fix needed

The EIA implementation produces unrealistic wind patterns on a stretched grid and needs to be checked. The problem may be in how winds are specified across the various altitudes.


Ubuntu Linux ARM blacs link issue

I was able to replicate @mattzett's issue. Matt is using a Mac M1 with Ubuntu 20.04 in a Parallels VM.

The error was a link failure against BLACS (screenshot omitted).

I used Ubuntu 21.04 on a Raspberry Pi 4, with the 64-bit installation, that is:

$ uname -m
aarch64

To reach the error faster, build only the main target:

cmake --build build --target gemini.bin

Working on aarch64 Raspberry Pi 4 with Ubuntu 21.04:

  • all system libraries, e.g. libmumps-dev, libscalapack-openmpi-dev, and so on
  • cmake -Dmumps_external=on

NOT working on aarch64 Raspberry Pi 4:

  • autobuild of all libraries (except MPI) via cmake --preset build
  • cmake -B build -Dscalapack_external=on

Working on CentOS 8 x86_64:

  • cmake -B build with all system libraries
  • cmake -B build -Dscalapack_external=yes
  • cmake -B build -Dscalapack_external=yes -Dmumps_external=yes

I noticed on Windows MSYS2 that MPI is linked after MUMPS but before Scalapack, yet it links fine. Is the same link order seen on the Ubuntu builds that fail?
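To compare link order directly, the full link command can be printed with a verbose build (using the same target as above):

cmake --build build --verbose --target gemini.bin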

hdf5 not autobuilding

If there is a problem getting Gemini to build (compile), please let us know the output of these commands, stopping at the first command that fails:

cmake -B build
-- HDF5 include: /usr/local/include
-- HDF5 library: /usr/local/lib/libhdf5_hl_fortran.a;/usr/local/lib/libhdf5_hl.a;/usr/local/lib/libhdf5.a;/usr/local/lib/libhdf5_fortran.a;/usr/local/lib/libhdf5.a;/usr/local/lib/libhdf5_fortran.a;/usr/local/lib/libhdf5.a;/usr/local/lib/libhdf5_fortran.a;/usr/local/lib/libhdf5.a;SZIP::SZIP;ZLIB::ZLIB;Threads::Threads;m
-- Performing Test HDF5_compiles_ok
-- Performing Test HDF5_compiles_ok - Failed
-- Performing Test HDF5_runs_ok
-- Performing Test HDF5_runs_ok - Failed
-- h5fortran: HDF5 not working
CMake Error at cmake/h5fortran.cmake:18 (message):
  HDF5 was requested but is not available.
Call Stack (most recent call first):
  CMakeLists.txt:60 (include)
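A possible workaround to try, assuming Gemini3D exposes an HDF5 autobuild option analogous to the scalapack_external / mumps_external options used above (the option name below is an assumption, not a confirmed flag):

cmake -B build -Dhdf5_external=yes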

Intel: interp segfaults

This issue was noticed a couple of months ago but was not addressed at the time, since it was thought to be caused elsewhere.

With the Intel 2020 compiler on Linux, at least, the interp2 and interp3 tests immediately segfault on line 1 of the code, in both Release and Debug builds. See https://github.com/gemini3d/GEMINI/wiki/Reference-tests:-Intel-compiler for the failing test names.

Possible approaches:

  1. temporarily remove HDF5 file I/O in these tests alone, to see whether the failure is a quirk of how HDF5 is used in these tests
  2. check whether it is a bug in the test code itself (i.e., still seen when not using HDF5)
  3. check for an actual interpolation-code bug that gfortran does not detect, such as an array bounds issue (see the sketch after this list)
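To probe item 3, one option is to rebuild in Debug with Intel run-time checking enabled and rerun only the interp tests. The flags below are standard ifort options (-traceback, -check bounds); test selection uses the interp2/interp3 names mentioned above:

cmake -B build -DCMAKE_BUILD_TYPE=Debug -DCMAKE_Fortran_FLAGS="-g -traceback -check bounds"
cmake --build build
cd build && ctest -R interp -V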
