ecco-access's Issues
Actions needed for next V4 dataset release
Issues to resolve before next release (V4r5)
NetCDF Processing
- Prepare grid files FIRST, then use them when generating the other granules
- assign flux or velocity quantities their own 'bounds' variable that provides the lat/lon coordinates for the two grid cell "corners"
- define collection summaries separately, assign keys to collection title or PODAAC shortname or ID
- fold processing of 1D and 3D time-invariant cases into the main NetCDF generation code
- deliver collection summary text to PO.DAAC at the end for their CMR
- save each variable's min/max within the granules of each collection when first constructing netcdf and sweep those at the end to get global min/max for collection/variable
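The min/max bookkeeping in the last item above could look like the following sketch (the function names and the per-granule stats layout are assumptions for illustration, not the production code):

```python
import numpy as np

def granule_min_max(data):
    """Per-granule min/max for one variable, ignoring NaN fill values."""
    return float(np.nanmin(data)), float(np.nanmax(data))

def sweep_global_min_max(per_granule_stats):
    """Reduce saved per-granule (min, max) pairs to the collection-wide
    valid_min/valid_max for one variable, computed once at the end."""
    mins, maxs = zip(*per_granule_stats)
    return min(mins), max(maxs)
```

The per-granule pairs would be written out (e.g. as small JSON sidecars) while each NetCDF granule is constructed, so the final sweep never has to reopen the granules themselves.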
Metadata
- descriptions of DFrI_TH and DFrI_SLT should mention contributions from GGL
- momentum budget terms
- possibly new sea-ice terms
- new ice-shelf terms
- Define EmPmR in oceQnet metadata
Code Change
- add missing sea-ice diagnostic terms
- add momentum budget tendency terms
- mask sea-ice velocity where there is no sea ice
- add accurate mixed layer depths (use GGL not Kara)
New Datasets
- GM streamfunction 'rotated' to lat-lon
- https://journals.ametsoc.org/view/journals/phoc/28/6/1520-0485_1998_028_1205_moadti_2.0.co_2.xml
- barotropic streamfunctions
- https://xgcm.readthedocs.io/en/latest/xgcm-examples/02_mitgcm.html?highlight=barotropic#Barotropic-Transport-Streamfunction
- basic climatology fields (T, S, rho; U, V, W; SSH; OBP; sea-ice concentration, sea-ice and snow thicknesses; sea-ice velocity; others?)
- momentum budget closure terms
- ocean heat content 1D time series (full depth, 0-700m & 0-2000m; global and basin)
- OBP anomaly (OBPANOM): remove reference H * rho, possibly at run time
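For the barotropic streamfunction item, the xgcm link above gives the real MITgcm/LLC-grid recipe; a minimal conceptual sketch on a regular grid (an assumption here, with a scalar dy and a simple sign convention) is just the meridional cumulative integral of the depth-integrated zonal transport:

```python
import numpy as np

def barotropic_streamfunction(u, dz, dy):
    """Barotropic transport streamfunction on a regular grid (conceptual
    sketch only; not the LLC-grid recipe from the linked xgcm example).

    u  : zonal velocity, shape (nz, ny, nx), m/s
    dz : layer thicknesses, shape (nz,), m
    dy : meridional grid spacing, m (scalar here for simplicity)

    Returns Psi(y, x) in m^3/s; sign convention varies by author.
    """
    # depth-integrate the zonal velocity -> transport per unit width (m^2/s)
    u_depth_int = np.tensordot(dz, u, axes=(0, 0))   # shape (ny, nx)
    # cumulatively integrate northward from the southern boundary
    return np.cumsum(u_depth_int * dy, axis=0)
```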
Offline adjustment
- Melt rate: time-mean and spin-up
- Global mean SSH and OBP
Checking budget closure
- Temperature budget
- Salinity budget
- Momentum
- sea-ice volume
- sea-ice energy
Documentation
- Synopsis
- User guide
  - add diagram of different air/sea heat fluxes (EXFqnet, oceQnet, TFLUX, etc.)
- Budget
Figures
(cost plots generally include V4r4 [using V4r5 data and error], V4r5 iteration 0, and V4r5)
- Cost bar plot
- Global mean SSH and OBP (against observation-based estimate)
- Total melt rate time-series (against observation-based estimate)
- Cost vs time
- LSC SSH
- OBP
- Argo
- SST
- SSS (Aquarius, SMOS, and SMAP)
- Sea-ice concentration
- Cost vs space
- MDT
- LSC SSH
- OBP
- Argo
- SST
- SSS (Aquarius, SMOS, and SMAP)
- Sea-ice concentration
- Climatology TS
- Melt rate (Only V4r5 iteration 0 and final V4r5 release)
- gcmfaces plots
Other
- River Runoff and Iceberg Runoff should be in forcing directories, not init directories.
- Geothermal flux should also be in forcing directories, not init directories.
EXFqnet direction attribute
The "direction" attribute for EXFqnet is incorrect. The correct value should be

    "direction": ">0 decreases potential temperature (THETA)",

The existing, incorrect value is

    "direction": ">0 increases potential temperature (THETA)",
The bug was found by Andrew Delman.
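A minimal sketch of the fix, with a plain dict standing in for EXFqnet's netCDF attributes (in production this would be applied with e.g. netCDF4 or `ncatted`):

```python
# The wrong/right strings come from the issue text above.
WRONG = ">0 increases potential temperature (THETA)"
RIGHT = ">0 decreases potential temperature (THETA)"

def fix_exfqnet_direction(attrs):
    """Replace the incorrect 'direction' attribute on an EXFqnet variable;
    leave already-correct granules untouched."""
    if attrs.get("direction") == WRONG:
        attrs["direction"] = RIGHT
    return attrs
```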
Dry/wet points determination in the interpolated fields
Add information to the meta data about how dry/wet points are determined in the interpolated fields.
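One plausible convention to document (an assumption here, not necessarily what the production mapping factors actually do) is to call an interpolated lat-lon cell wet when the wet fraction of contributing native-grid cells exceeds a threshold:

```python
import numpy as np

def wet_mask_from_fraction(wet_fraction, threshold=0.5):
    """Classify interpolated lat-lon cells as wet/dry from the fraction of
    native-grid wet cells contributing to each target cell.

    The 0.5 threshold is illustrative; whatever rule the production
    mapping factors use is what the metadata should state.
    """
    return wet_fraction >= threshold

frac = np.array([[0.0, 0.4], [0.6, 1.0]])
mask = wet_mask_from_fraction(frac)
```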
A smaller argument list for 'generate_netcdfs' could be possible
From SASSIE: most directories are now specified in the product_generation_config.json file in 'metadata_dir'.
'array_precision' should come from product_generation_config and not be hard-coded (what was I thinking?)
'time_step_selection_method' is nothing to replicate.
G, ecco_grid = generate_netcdfs(output_freq_code, job_id, num_jobs,
                                product_type,
                                metadata_dir,
                                array_precision,
                                grouping_to_process,
                                time_step_selection_method,
                                debug_mode)
- wrap long lines
- 'bounds' should be called 'latlon_bounds'
- the latlon grid can be pre-made, just like the mapping factors
- look for file paths using exact match (after splitting on '/'), instead of 'in'
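The exact-match point matters because substring tests with `in` produce false positives on variable names that prefix one another (e.g. SSH vs SSHIBC). A small sketch of component-wise matching:

```python
from pathlib import PurePosixPath

def path_has_component(path, name):
    """True only if `name` matches a whole path component (split on '/'),
    avoiding the false positives of substring matching with `in`."""
    return name in PurePosixPath(path).parts
```

With substring matching, `'SSH' in '/diags/SSHIBC/day.nc'` is True; with component matching it is correctly False.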
metadata without all the hard coding of ECCO V4r4
#
metadata_dir = '/home/ifenty/git_repos_others/SASSIE/ECCO/metadata/SASSIE_N1_metadata_json'
metadata = load_all_metadata(metadata_dir)
product_generation_config = metadata['product_generation_config']
# metadata for different variables
global_metadata_for_all_datasets = metadata['global_metadata_for_all_datasets']
global_metadata_for_latlon_datasets = metadata['global_metadata_for_latlon_datasets']
global_metadata_for_native_datasets = metadata['global_metadata_for_native_datasets']
coordinate_metadata_for_1D_datasets = metadata['coordinate_metadata_for_1D_datasets']
coordinate_metadata_for_latlon_datasets = metadata['coordinate_metadata_for_latlon_datasets']
coordinate_metadata_for_native_datasets = metadata['coordinate_metadata_for_native_datasets']
geometry_metadata_for_latlon_datasets = metadata['geometry_metadata_for_latlon_datasets']
geometry_metadata_for_native_datasets = metadata['geometry_metadata_for_native_datasets']
groupings_for_1D_datasets = metadata['groupings_for_1D_datasets']
groupings_for_latlon_datasets = metadata['groupings_for_latlon_datasets']
groupings_for_native_datasets = metadata['groupings_for_native_datasets']
variable_metadata_latlon = metadata['variable_metadata_for_latlon_datasets']
variable_metadata_default = metadata['variable_metadata']
variable_metadata_native = variable_metadata_default + geometry_metadata_for_native_datasets
nk = int(ecco.find_metadata_in_json_dictionary('num_vertical_levels', 'name',\
metadata['product_generation_config'])['value'])
binary_fill_value = int(ecco.find_metadata_in_json_dictionary('binary_fill_value', 'name',\
metadata['product_generation_config'])['value'])
num_vertical_levels_to_process =\
int(ecco.find_metadata_in_json_dictionary('num_vertical_levels_to_process', 'name',\
metadata['product_generation_config'])['value'])
model_start_time =\
np.datetime64(ecco.find_metadata_in_json_dictionary('model_start_time', 'name',\
metadata['product_generation_config'])['value'])
model_end_time =\
np.datetime64(ecco.find_metadata_in_json_dictionary('model_end_time', 'name',\
metadata['product_generation_config'])['value'])
model_time_step =\
int(ecco.find_metadata_in_json_dictionary('model_time_step', 'name',\
metadata['product_generation_config'])['value'])
diags_root_dir = Path(ecco.find_metadata_in_json_dictionary('diags_root_dir', 'name',\
metadata['product_generation_config'])['value'])
output_dir_base = Path(ecco.find_metadata_in_json_dictionary('output_dir_base','name',\
metadata['product_generation_config'])['value'])
print('\n')
print('model_time_step ', model_time_step)
print('model_start_time', model_start_time)
print('model_end_time', model_end_time)
print('nk=', nk)
#
# load PODAAC fields
#podaac_dataset_table = read_csv(podaac_dir / 'datasets.csv')
ecco_grid_dir = Path(ecco.find_metadata_in_json_dictionary('ecco_grid_dir', 'name',
metadata['product_generation_config'])['value'])
print(ecco_grid_dir)
ecco_grid_filename = Path(ecco.find_metadata_in_json_dictionary('ecco_grid_filename', 'name',
metadata['product_generation_config'])['value'])
print(ecco_grid_dir / ecco_grid_filename)
- lat-lon grid specs should go in the metadata file "product_generation_config.json"
- adding metadata to coordinates can be done after the merge
- define the record time dictionary near where record times are defined (a few lines up)
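The repeated `find_metadata_in_json_dictionary(...)['value']` + cast pattern in the snippet above could be collapsed with a small getter (a hypothetical helper, not part of the ecco package; the entry layout is inferred from the usage above):

```python
def make_config_getter(config_entries):
    """Build a lookup over a list of {'name': ..., 'value': ...} records,
    replacing repeated find_metadata_in_json_dictionary calls.
    (Hypothetical helper; entry layout assumed from the snippet above.)"""
    index = {entry['name']: entry['value'] for entry in config_entries}
    def get(name, cast=lambda v: v):
        return cast(index[name])
    return get
```

Usage would then shrink each lookup to one line, e.g. `cfg = make_config_getter(metadata['product_generation_config'])` followed by `nk = cfg('num_vertical_levels', int)`.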