
An extension of xarray for climate data analysis on structured grids.

Home Page: https://xcdat.readthedocs.io/en/latest/

License: Apache License 2.0


xcdat's Introduction

xCDAT logo

Xarray Climate Data Analysis Tools

Badges
Distribution conda-forge platforms conda-downloads
Citation zenodo-doi
DevOps CI/CD Build Workflow codecov docs
Quality Assurance pre-commit black flake8 mypy

xCDAT is an extension of xarray for climate data analysis on structured grids. It serves as a modern successor to the Community Data Analysis Tools (CDAT) library.

Useful links: Documentation | Code Repository | Issues | Discussions | Releases | Mailing List

Project Motivation

The goal of xCDAT is to provide generalizable features and utilities for simple and robust analysis of climate data. xCDAT's design philosophy is focused on reducing the overhead required to accomplish certain tasks in xarray. xCDAT aims to be compatible with structured grids that are CF-compliant (e.g., CMIP6). Some key xCDAT features are inspired by or ported from the core CDAT library, while others leverage powerful libraries in the xarray ecosystem (e.g., xESMF, xgcm, cf_xarray) to deliver robust APIs.

The xCDAT core team's mission is to provide a maintainable and extensible package that serves the needs of the climate community in the long-term. We are excited to be working on this project and hope to have you onboard!

Getting Started

The best resource for getting started is the xCDAT documentation website. Our documentation provides general guidance for setting up xCDAT in an Anaconda environment on your local computer or on an HPC/Jupyter environment. We also include an API Overview and Gallery to highlight xCDAT functionality.

Community

xCDAT is a community-driven open source project. We encourage discussion on topics such as version releases, feature suggestions, and architecture design on the GitHub Discussions page.

Subscribe to our mailing list for news and announcements related to xCDAT, such as software version releases or future roadmap plans.

Please note that xCDAT has a Code of Conduct. By participating in the xCDAT community, you agree to abide by its rules.

Contributing

We welcome and appreciate contributions to xCDAT. Users and contributors can view and open issues on our GitHub Issue Tracker.

For more instructions on how to contribute, please check out our Contributing Guide.

Features

  • Extension of xarray's open_dataset() and open_mfdataset() with post-processing options
    • Generate bounds for axes supported by xcdat if they don't exist in the Dataset
    • Optional selection of single data variable to keep in the Dataset (bounds are also kept if they exist)
    • Optional decoding of time coordinates
      • In addition to CF time units, also decodes common non-CF time units ("months since ...", "years since ...")
    • Optional centering of time coordinates using time bounds
    • Optional conversion of longitudinal axis orientation between [0, 360) and [-180, 180)
  • Temporal averaging
    • Time series averages (single snapshot and grouped), climatologies, and departures
    • Weighted or unweighted
    • Optional seasonal configuration (e.g., DJF vs. JFD, custom seasons)
  • Geospatial weighted averaging
    • Supports rectilinear grid
    • Optional specification of regional domain
  • Horizontal structured regridding
    • Supports rectilinear and curvilinear grids
    • Extends the xESMF horizontal regridding API
    • Python implementation of regrid2 for handling Cartesian latitude-longitude grids
  • Vertical structured regridding
    • Supports rectilinear and curvilinear grids
    • Extends the xgcm vertical regridding API
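The time-centering option above can be illustrated with plain xarray. This is a sketch of the idea only, not xcdat's actual implementation: centering replaces each time point with the midpoint of its time bounds.

```python
import pandas as pd
import xarray as xr

# Sketch of "centering time coordinates using time bounds": replace each
# time point with the midpoint of its bounds (illustration only).
bnds = xr.DataArray(
    pd.to_datetime(
        ["2000-01-01", "2000-02-01", "2000-02-01", "2000-03-01"]
    ).values.reshape(2, 2),
    dims=("time", "bnds"),
)
lower = bnds.isel(bnds=0)
upper = bnds.isel(bnds=1)
centered = lower + (upper - lower) / 2
print(centered.values)  # midpoints: 2000-01-16T12:00, 2000-02-15T12:00
```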

Things We Are Striving For

  • xCDAT supports CF compliant datasets, but will also strive to support datasets with common non-CF compliant metadata (e.g., time units in "months since ..." or "years since ...")
    • xCDAT leverages cf_xarray to interpret CF attributes on xarray objects
    • Refer to CF Convention for more information on CF attributes
  • Robust handling of dimensions and their coordinates and coordinate bounds
    • Coordinate variables are retrieved with cf_xarray using CF axis names or coordinate names found in xarray object attributes. Refer to Metadata Interpretation for more information.
    • Bounds are retrieved with cf_xarray using the "bounds" attr
    • Ability to operate on both longitudinal axis orientations, [0, 360) and [-180, 180)
  • Support for parallelism using dask where it is both possible and makes sense
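The longitude-orientation goal above can be sketched in plain xarray (illustration of the idea only; xcdat ships its own utility for this):

```python
import numpy as np
import xarray as xr

# Sketch: convert a [0, 360) longitude axis to [-180, 180) and re-sort.
da = xr.DataArray(
    np.arange(4.0),
    coords={"lon": [0.0, 90.0, 180.0, 270.0]},
    dims="lon",
)
swapped = da.assign_coords(lon=(((da.lon + 180) % 360) - 180)).sortby("lon")
print(swapped.lon.values)  # [-180. -90. 0. 90.]
```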

Releases

xCDAT (released as xcdat) follows a feedback-driven release cycle using continuous integration/continuous deployment. Software releases are performed based on the bandwidth of the development team, the needs of the community, and the priority of bug fixes or feature updates.

After releases are performed on GitHub Releases, the corresponding xcdat package version will be available to download through the conda-forge channel, usually within a day.

Subscribe to our mailing list to stay notified of new releases.

Useful Resources

We highly encourage you to check out the awesome resources below to learn more about Xarray and Xarray usage in climate science!

Projects Using xCDAT

xCDAT is actively being integrated as a core component of the Program for Climate Model Diagnosis and Intercomparison (PCMDI) Metrics Package and the Energy Exascale Earth System Model Diagnostics (E3SM) Package. xCDAT is also included in the E3SM Unified Anaconda Environment that is deployed on various U.S. Department of Energy supercomputers to run E3SM software tools.

Acknowledgement

xCDAT is jointly developed by scientists and developers from the Energy Exascale Earth System Model (E3SM) Project and Program for Climate Model Diagnosis and Intercomparison (PCMDI). The work is performed for the E3SM project, which is sponsored by Earth System Model Development (ESMD) program, and the Simplifying ESM Analysis Through Standards (SEATS) project, which is sponsored by the Regional and Global Model Analysis (RGMA) program. ESMD and RGMA are programs for the Earth and Environmental Systems Sciences Division (EESSD) in the Office of Biological and Environmental Research (BER) within the Department of Energy's Office of Science.

Contributors

Thank you to all of our contributors!

xCDAT contributors

License

xCDAT is licensed under the terms of the Apache License (Version 2.0 with LLVM exception).

All new contributions must be made under the Apache-2.0 with LLVM exception license.

See LICENSE and NOTICE for details.

SPDX-License-Identifier: Apache-2.0

LLNL-CODE-846944


xcdat's Issues

Support for time units of "months since..."

Is your feature request related to a problem? If yes, please describe

One feature of CDAT is its ability to flexibly read in a large number of time units. In testing some code, I realized that xarray does not natively support "months since ..."

import xarray as xr

fn = 'rss_4.0_ttt_197901-202012.nc'
ds = xr.open_dataset(fn)

ValueError: unable to decode time units 'months since 1979-01-01' with "calendar 'standard'". Try opening your dataset with decode_times=False or installing cftime if it is not installed.

Describe your proposed solution

Ideally, we would be able to open and decode files like this, perhaps with other packages (e.g., nctime). Could a fix for this issue (and potentially other fixes) be included in xcdat utilities (e.g., open_datasets)?

Describe alternative solutions you've considered

One obvious alternative is to force the user to use the decode_times=False option and deal with the time axis on their own or deal with the axis information after loading the dataset.
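A workaround along these lines might look like the following sketch, which decodes hypothetical "months since" offsets manually using calendar-aware month arithmetic (pandas DateOffset); the dataset here is synthetic:

```python
import numpy as np
import pandas as pd
import xarray as xr

# Hypothetical workaround sketch: keep times undecoded, then decode the
# "months since" offsets by adding N calendar months to the reference date.
time = xr.DataArray(
    np.arange(3), dims="time", attrs={"units": "months since 1979-01-01"}
)
ds = xr.Dataset({"tas": ("time", [288.0, 288.5, 289.0])}, coords={"time": time})

ref = pd.Timestamp("1979-01-01")
decoded = [ref + pd.DateOffset(months=int(n)) for n in ds.time.values]
ds = ds.assign_coords(time=decoded)
print(ds.time.values[:2])  # 1979-01-01, 1979-02-01
```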

Additional context

If we had a utility to help load data, we could also add axis information (e.g., bounds) when opening the dataset and include the option of stripping out extraneous variables, e.g., tas = xcdat.open_dataset(fn, variable='tas').

Additional tasks based on 7/21 meeting

  • Take into account dataset's calendar when decoding time

Add napoleon extension to replace reST with numpy docstring

Describe your documentation update

Update docstring style from reST to numpy docstring format for readability and alignment with xarray.

https://sphinxcontrib-napoleon.readthedocs.io/en/latest/sphinxcontrib.napoleon.html
https://github.com/pydata/xarray/blob/master/doc/conf.py

Requirements

  • Add sphinxcontrib.napoleon to conda dev and rtd envs
  • Add settings to conf.py from xarray conf.py
  • Add VSCode workspace with the autoDocstring.docstringFormat: numpy setting
  • Update existing docstrings to numpy docstring

Add functions to calculate essential statistical metrics

First, determine absolute must-have functions for calculating statistical metrics to avoid scope-creep. Then check if existing third-party libraries (NumPy, Pandas, xarray) already support them out-of-the-box.

Describe the solution you'd like

Standard deviation, root mean square error, and correlation: grid-aware (i.e., weighted)

Acceptance criteria:

  • Decide if we want to wrap NumPy/SciPy calls into utility method(s), or have user rely on library APIs directly

Additional context

CDAT equivalent:

Add temporal averaging functionality (time series avg, climatology and departure)

Describe the solution you'd like

Temporal averaging for time series, climatology, and departures for different time frequencies with the option for weighted or unweighted means.

Acceptance criteria

Bounds

  • #90
  • Uses time bounds to recompute time, or throws an error if they don't exist.

Weights

  • Weighted means (xcdat default)
    • If time bounds exist, generates weights using them
    • If time bounds don't exist, raises an error that time bounds must be added.
    • Considers leap years (uses datetime component days_in_month). More info here
    • Handles getting weights for all time intervals (hourly, daily, monthly, seasonal, yearly)
  • Unweighted means
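The weighted-mean idea above can be sketched with hypothetical bound widths:

```python
import numpy as np

# Sketch of deriving monthly weights from time-bound widths (values are
# hypothetical day offsets for Jan-Mar 2000, a leap year).
bnds = np.array([[0, 31], [31, 60], [60, 91]], dtype=float)
widths = bnds[:, 1] - bnds[:, 0]   # 31, 29, 31 days
weights = widths / widths.sum()    # normalized so they sum to 1
print(weights)
```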

Timeseries Averaging

  • annual
  • monthly
  • season
  • custom season
  • day
  • hour
  • nHour (nice-to-have)

Climatology

  • month (aka ANNUALCYCLE) -- monthly means for each month of the year for all years
  • season (aka SEASONALCYCLE) -- means for the 4 predefined seasons for all years
    • Handle DJF season
      • continuous (previous year December, remove incomplete seasons) (xcdat default)
      • discontinuous (same year December) (xarray default, there is no continuous variant for seasonal grouping)
  • custom season
  • day -- daily means for each day of the year for all years
  • diurnalNN (nice-to-have)
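The discontinuous xarray default referenced above can be seen directly with xarray's built-in seasonal grouping, where each December is grouped under the same calendar year's DJF label:

```python
import numpy as np
import pandas as pd
import xarray as xr

# xarray's built-in seasonal grouping (discontinuous DJF by default).
time = pd.date_range("2000-01-01", "2001-12-01", freq="MS")
da = xr.DataArray(
    np.arange(len(time), dtype=float), coords={"time": time}, dims="time"
)
clim = da.groupby("time.season").mean()
print(sorted(clim.season.values))  # ['DJF', 'JJA', 'MAM', 'SON']
```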

Departure

  • Handles monthly, seasonal, and annual climatology
  • Handles hourly climatology
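A departure is the series minus its climatology; a minimal groupby sketch:

```python
import numpy as np
import pandas as pd
import xarray as xr

# Departure sketch: subtract a monthly climatology from the time series.
time = pd.date_range("2000-01-01", periods=24, freq="MS")
da = xr.DataArray(np.arange(24.0), coords={"time": time}, dims="time")
clim = da.groupby("time.month").mean()   # monthly climatology over both years
dep = da.groupby("time.month") - clim    # anomalies relative to the climatology
print(float(dep.isel(time=0)))  # -6.0 (first Jan vs. mean of the two Jans)
```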

Edge Cases

  • Time and output time for climatology require special handling. The month and day should be centered, but the year needs to be selected (first year? last year?). The midpoint year may not be an integer.
  • Confirm decoding non-CF time units produces the correct calendar type, otherwise use existing calendar

Additional context

https://stackoverflow.com/questions/59234745/is-there-any-easy-way-to-compute-seasonal-mean-with-xarray
https://cdat.llnl.gov/documentation/utilities/utilities-1.html
Prototypes:

Update docs theme and settings

Describe your documentation update

  • Replace alabaster theme with sphinx_book_theme
  • Add ability to copy code cells with sphinx-copybutton
  • Add link to the xarray docs

Add tbump to simplify the package versioning process

Is your feature request related to a problem? If yes, please describe

Manually updating the version string in the necessary files is tedious and error-prone.

Describe your proposed solution

tbump is a Python package that automatically bumps the version of the software in the applicable files by running a simple command (e.g., tbump 1.2.0, tbump 1.2.0rc1)

Describe alternative solutions you've considered

bump2version is another, more popular package. However, it requires custom configuration to support PEP440 which can become unwieldy.

https://medium.com/tanker-blog/automating-version-number-updates-what-could-go-wrong-75edd190ee47

Axis attributes dropped after retrieving/generating bounds

I noticed that if I generate or retrieve axis bounds, some of the axis information is dropped. Ideally, all of the axis information should be retained when generating bounds.

What are the steps to reproduce this issue?

import xarray as xr
from xcdat import axis

fn = '/p/css03/esgf_publish/CMIP6/CMIP/NCAR/CESM2/historical/r1i1p1f1/Amon/tas/gn/v20190308/tas_Amon_CESM2_historical_r1i1p1f1_gn_185001-201412.nc'
ds = xr.open_dataset(fn)
print(ds.lat.units)

degrees_north

lat_bnds = ds.axis.get_bounds("lat", generate="True")
print(ds.lat.units)

AttributeError: 'DataArray' object has no attribute 'units'
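For context, a common way attrs get lost in xarray (possibly the mechanism behind this bug) is that many operations discard attrs unless keep_attrs is enabled; a minimal demonstration:

```python
import numpy as np
import xarray as xr

# Arithmetic drops attrs by default; xr.set_options(keep_attrs=True)
# preserves them.
lat = xr.DataArray(
    np.array([-45.0, 0.0, 45.0]), dims="lat", attrs={"units": "degrees_north"}
)
assert "units" not in (lat + 0).attrs  # attrs dropped by default
with xr.set_options(keep_attrs=True):
    assert (lat + 0).attrs["units"] == "degrees_north"  # attrs preserved
```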

Add tests to check consistency between CDAT and xCDAT features, and across future xcdat versions

Is your feature request related to a problem? If yes, please describe

We need a robust system to make sure results stay consistent across versions.

Describe your proposed solution

Build an incremental test system (integrated into CI) that cross-checks results from CDAT and xCDAT with pass/fail tests. We can host the test data on a data server and download it as needed.
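A minimal sketch of such a pass/fail check (all values here are hypothetical):

```python
import numpy as np

# Pass/fail consistency test: compare a new result against a stored
# reference within a tolerance.
reference = np.array([288.0, 289.1, 290.3])  # e.g., archived CDAT result
result = np.array([288.0, 289.1, 290.3])     # xcdat result for the same input
np.testing.assert_allclose(result, reference, rtol=1e-12)
print("consistent")
```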

Describe alternative solutions you've considered

I don't have a clear idea about the design yet (maybe we can borrow some ideas from CDAT?), but I strongly recommend having this type of test from the beginning.

Additional context

Refactor algorithm to generate bounds for readability and simplicity

Is your feature request related to a problem? If yes, please describe

The current logic for generating bounds is based on cdms2.axis.getBounds() and its child methods. The logic for longitude bounds is hard to understand, even after attempting a refactor with clearer variable names.

Describe your proposed solution

Take a look at Iris's implementation called guess_bounds().

Acceptance Criteria

  • Much more legible and maintainable
  • Tests are updated
  • #82
  • Uses cf_xarray to handle CF convention names for coords and bounds
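The midpoint-based approach used by Iris's guess_bounds() can be sketched as follows (a simplified illustration, not xcdat's implementation):

```python
import numpy as np

# Midpoint-based bounds guessing: interior bounds are midpoints between
# adjacent points; end bounds mirror the first/last midpoint.
def guess_bounds(points):
    points = np.asarray(points, dtype=float)
    mids = (points[:-1] + points[1:]) / 2.0
    lower = np.concatenate([[2 * points[0] - mids[0]], mids])
    upper = np.concatenate([mids, [2 * points[-1] - mids[-1]]])
    return np.column_stack([lower, upper])

print(guess_bounds([0.0, 1.0, 2.0]))  # [[-0.5 0.5] [0.5 1.5] [1.5 2.5]]
```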

Describe alternative solutions you've considered

Additional context

Add grid-aware geospatial averaging (weighted)

Describe the solution you'd like

Flexible (e.g., in averaging order), accurate, weighted averaging over a precise domain

Acceptance criteria:

  • Averaging options include weighted, unweighted, or custom weights
  • Assigns masked data zero-weight
  • Ensure we can convert between 0:360 longitude and -180:180 longitude
  • Allow users to specify a specific averaging region (i.e., routine calculates proper weights for sub-region)
  • Performs horizontal averaging, but can handle higher dimensional data (i.e., x(time, altitude, lat, lon) -> x(time, altitude) after averaging)
  • Can average over latitude, longitude, or both axes
  • Operates on datasets with common or cf_compliant metadata (i.e., inclusion of lat/latitude and lon/longitude)
  • Axes to be averaged can be specified by name, going with "lat" and "lon" instead of CDAT notation (i.e., axis='xy')
  • #114 This can be done later
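The core weighted-averaging idea can be sketched with cos(lat) weights on a tiny rectilinear grid (xcdat derives its weights from bounds; this shows only the principle):

```python
import numpy as np
import xarray as xr

# Weighted spatial average sketch: cosine-of-latitude weights on a
# rectilinear grid; a constant field averages to itself.
tas = xr.DataArray(
    np.full((3, 4), 288.0),
    coords={"lat": [-60.0, 0.0, 60.0], "lon": [0.0, 90.0, 180.0, 270.0]},
    dims=("lat", "lon"),
)
weights = np.cos(np.deg2rad(tas.lat))
mean = tas.weighted(weights).mean(dim=("lat", "lon"))
print(float(mean))  # 288.0 for a constant field
```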

Additional context

CDAT equivalent:

Prototype:

Resources

Unexpected behavior with open_variable

I am using PR #95, but I think this issue affects main. When I use open_dataset and open_variable an extra dimension is added to my DataArray, which I think is happening here.

import xarray as xr
from xcdat.dataset import open_dataset
from xcdat.variable import open_variable

# open file
fn = 'tas_Amon_CESM2_historical_r1i1p1f1_gn_185001-201412.nc'
ds = open_dataset(fn)
# show dataarray shape
print('original shape', ds.tas.shape)
# show dataarray shape after calling open_variable
tas = open_variable(ds, 'tas')
print('altered shape', tas.shape)
ds.close()

original shape (1980, 192, 288)
altered shape (2, 1980, 192, 288)

What were you expecting to happen?

It appears that an extraneous nbnd dimension is added to the DataArray (and should not be).

In: tas.dims
Out: ('nbnd', 'time', 'lat', 'lon')

Any other comments?

I assume this is related to this issue + fix.

Add AxisAccessor to return boundary information

Is your feature request related to a problem? Please describe

xarray does not have a built-in method to calculate axis boundaries if they do not exist in the dataset's data variables (lat_bnds/latitude_bnds and lon_bnds/longitude_bnds).

Describe the solution you'd like

Using @xr.register_dataset_accessor('axis'), add AxisAccessor class with method get_bounds().

Example usage:

ds = xr.open_dataset(...)
lat_bnds = ds.axis.get_bounds('lat')
lon_bnds = ds.axis.get_bounds('lon')
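A minimal sketch of the proposed accessor (names come from this issue; the implementation details are assumed):

```python
import numpy as np
import xarray as xr

# Register a Dataset accessor under the "axis" namespace, as proposed.
@xr.register_dataset_accessor("axis")
class AxisAccessor:
    def __init__(self, ds):
        self._ds = ds

    def get_bounds(self, coord):
        # Look up an existing bounds variable such as "lat_bnds".
        name = f"{coord}_bnds"
        if name not in self._ds:
            raise KeyError(f"No bounds found for {coord!r}")
        return self._ds[name]

ds = xr.Dataset(
    {"lat_bnds": (("lat", "bnds"), np.array([[-90.0, 0.0], [0.0, 90.0]]))},
    coords={"lat": [-45.0, 45.0]},
)
print(ds.axis.get_bounds("lat").shape)  # (2, 2)
```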

Describe alternatives you've considered

Additional context

Add Dataset and data variable wrappers to apply common operations upon opening

Is your feature request related to a problem? If yes, please describe

CDAT users commonly work on a single TransientVariable of a dataset, which is equivalent to a single xarray Dataset data variable of type DataArray. However, an xarray data variable does not link to other data variables. This means that the extracted data variable does not have access to any existing axis bounds (also data variables).

Describe your proposed solution

Acceptance Criteria

  1. Add open_dataset function to open a dataset and generate bounds if they don't exist.
    • Extensible for additional operations.
    • Add call to generate bounds if it doesn't exist
  2. Add open_variable function to open a data variable with axis bounds (passed from parent Dataset). Replicates CDAT TransientVariable.
    • Extensible for additional operations
    • Add DataArray accessor to add additional attributes to DataArray class (to store bounds info)
    • Show all attributes of accessor class (just use __dict__)
    • Take into account dataset's calendar (EDIT: not needed, but keep calendar type in encoding if available)
      • I don’t think the calendar type affects the algorithm for generating a date range. Non-CF units “months since…” or “years since…” don’t factor in the number of days in a month.
      • pd.date_range() generates proleptic_gregorian datetime values

Describe alternative solutions you've considered

  • On creation of the variable (easiest for the user): tas = ds.tas
    • Not possible because this is native xarray syntax, which requires extending via subclassing (not recommended) or xarray accessors.
  • On dataset open: ds = xcdat.load_dataset(filename) [this would embed the bounds in the relevant variables so that they are included in ds.tas]
    • Not possible, I found Dataset data variables don't preserve custom attributes via DataArray accessors

Additional context

  • Extracts some tasks for #77

Initial Update

The bot created this issue to inform you that pyup.io has been set up on this repo.
Once you have closed it, the bot will open pull requests for updates as soon as they are available.

Add horizontal regridding

Blocked by #71

Describe the solution you'd like

Create time average of data with proper temporal weighting accounting for calendar peculiarities (e.g., from monthly mean, seasonal mean, or annual mean from monthly/sub-monthly data)

Acceptance criteria:

  • Decide if we want to wrap third-party library calls into utility method(s), or have user rely on library APIs directly
  • Ensure we can convert between 0:360 longitude and -180:180 longitude

Describe alternatives you've considered

  • Does SciPy do this?
  • Wrap xESMF/ESMF?
  • Wrap TempestRemap?

Additional context

CDAT/CDMS uses SCRIP2, ESMF, and regrid2
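For intuition, a rectilinear remapping can be sketched as separable linear interpolation in pure NumPy; conservative schemes such as regrid2 and xESMF additionally account for cell areas:

```python
import numpy as np

# Bilinear-style remapping sketch: interpolate along lat, then along lon.
src_lat = np.array([-60.0, 0.0, 60.0])
src_lon = np.array([0.0, 120.0, 240.0])
field = np.add.outer(src_lat, src_lon)  # field(lat, lon) = lat + lon

tgt_lat = np.array([-30.0, 30.0])
tgt_lon = np.array([60.0, 180.0])

tmp = np.stack(
    [np.interp(tgt_lat, src_lat, field[:, j]) for j in range(field.shape[1])],
    axis=1,
)
out = np.stack(
    [np.interp(tgt_lon, src_lon, tmp[i, :]) for i in range(tmp.shape[0])],
    axis=0,
)
print(out)  # exact for a linear field: [[30. 150.] [90. 210.]]
```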

Prototype:

Update conda-recipe and conda envs

Is your feature request related to a problem? If yes, please describe

  • Update conda-recipe build script and remove xesmf from run requirements.
  • Update conda envs to remove xesmf and add typing_extensions

Describe your proposed solution

N/A

Describe alternative solutions you've considered

N/A

Additional context

N/A

Separate getting and adding bounds to avoid implicit side effects

Is your feature request related to a problem? If yes, please describe

After reviewing the lines below, I realized that it is not best practice to perform an implicit state change on an object while simultaneously returning information about it. This can lead to confusing behavior with functions/methods.
The allow_generating boolean flag performs the implicit state change.

https://github.com/tomvothecoder/xcdat/blob/62e15fb2b5ea387359a678df0f5259dd1f58ddfb/xcdat/bounds.py#L59-L111

Describe your proposed solution

Separate get_bounds() into get_bounds() and guess_bounds().

  • get_bounds() retrieves information about the Dataset (the bounds)
  • guess_bounds() changes the state of the Dataset by adding bounds

Describe alternative solutions you've considered

N/A

Additional context

Acceptance Criteria

  • get_bounds() returns bounds for a coordinate (use cf_xarray)
  • add_bounds() adds bounds for a coordinate
    • If they exist, user must first drop existing bounds
    • If they don't exist, run _guess_bounds()
  • Add property bounds, which is a dict that maps coordinate keys to bounds DataArrays

Add attributes and properties to AxisAccessor class and refactor `get_lat_bounds()`

Is your feature request related to a problem? If yes, please describe

Related to #39

Describe your proposed solution

Adding class attributes or properties to describe the axis would be useful, for example to indicate whether the dataset already has bounds.

  • Consider refactoring get_bounds() and child methods (specifically get_lat_bounds(), which seems overly complex)

Acceptance criteria

Describe alternative solutions you've considered

Additional context

Add fetching of dict of bounds for a DataArray, similar to cf-xarray with Datasets

Is your feature request related to a problem? If yes, please describe

cf-xarray only supports retrieving bounds for Datasets. xcdat adds functionality to store bounds in DataArrays, but needs a way to retrieve bounds dynamically.

Describe your proposed solution

Base the DataArray implementation of retrieving bounds similar to cf-xarray with Datasets.

bounds
get_bounds
get_bounds_dim_name

Describe alternative solutions you've considered

Additional context

Add label assignment to Anaconda publishing job

Is your feature request related to a problem? If yes, please describe

Release candidates should be labeled with a dev-specific label, while production releases should be released to main.

The label convention will be based on which GitHub repo hosts this project (#69) and which channel the Anaconda package is published to.

Describe your proposed solution

https://github.com/E3SM-Project/zstash/blob/e2b1ab3166e468e7a0813c84a56032e7becbb6ec/.github/workflows/release_workflow.yml#L118-L124

Describe alternative solutions you've considered

Additional context

Update logic for adding time bounds

Is your feature request related to a problem? If yes, please describe

Currently, time bounds are generated using the same generic bounds function as lat and lon.

Describe your proposed solution

TBD

Describe alternative solutions you've considered

Additional context

[Feature]: Support Climatology bounds

Feature request copied from @golaz:

Is your feature request related to a problem? If yes, please describe

Climatology files have a special type of time bounds:

    double climatology_bounds(time, nbnd) ;
            climatology_bounds:long_name = "time interval endpoints" ;
            climatology_bounds:units = "days since 0001-01-01 00:00:00" ;
            climatology_bounds:calendar = "noleap" ;

While they look like regular bounds, they are more subtle. For more details, see: http://cfconventions.org/cf-conventions/cf-conventions.html

dimensions:
  time=4;
  nv=2;
variables:
  float temperature(time,lat,lon);
    temperature:long_name="surface air temperature";
    temperature:cell_methods="time: minimum within years time: mean over years";
    temperature:units="K";
  double time(time);
    time:climatology="climatology_bounds";
    time:units="days since 1960-1-1";
  double climatology_bounds(time,nv);
data:  // time coordinates translated to date/time format
  time="1960-4-16", "1960-7-16", "1960-10-16", "1961-1-16" ;
  climatology_bounds="1960-3-1",  "1990-6-1",
                     "1960-6-1",  "1990-9-1",
                     "1960-9-1",  "1990-12-1",
                     "1960-12-1", "1991-3-1" ;

Focusing on the first set of bounds: "1960-3-1", "1990-6-1"

The climatology spans the years 1960 to 1990 (inclusive), and the months of March, April, May.

It would be best if xCDAT could handle regular time series bounds, as well as climatology bounds transparently. There will be a need for an auxiliary function that computes the time span between the bounds (to use as weights for time averaging). It’s doable, but a little trickier for climatology bounds.

One additional complexity is that climatology bounds can also apply to diurnal averages.

We want to be able to perform time averaging operations on climatology data just like regular time series data. For example:

  • Average 10 yr climo into 50 yr climo.

  • Average monthly climo into seasonal or annual.

  • Average hourly diurnal climo into 6-hourly diurnal climo.

  • Etc.

If xCDAT is designed in a flexible manner from the onset, all of that becomes possible without any special treatment. If not, it will be a pain to add later (and lead to more convoluted, harder-to-maintain code).
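For the auxiliary span computation mentioned above, here is a sketch for the first bounds pair, assuming the weight-relevant span is the within-year portion (pairing the upper bound's month/day with the lower bound's year):

```python
import pandas as pd

# Span between climatology bound endpoints for the pair
# ("1960-3-1", "1990-6-1"), using the within-year portion only.
lower = pd.Timestamp("1960-03-01")
upper = pd.Timestamp("1960-06-01")  # month/day of "1990-6-1" in the lower year
span_days = (upper - lower).days
print(span_days)  # 92 days covering March, April, May
```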

Fix typos in contributing guide

Describe your documentation update

There are typos in the contributing guide, and additional content should be added to its testing section.

Replace `sphinx-multiversion` with ReadtheDocs for `sphinx.apidoc` to work

Is your feature request related to a problem? Please describe

sphinx-multiversion does not run the sphinx-apidoc command which autogenerates the API documentation found in docstrings.

Describe the solution you'd like

As a result, we need to use ReadTheDocs to generate and version our docs since it allows custom commands to be run.

Move repository to parent project org and add license

Once the project is officially funded, we should move the repo to the parent project org and add a license.

Acceptance Criteria

  • Move to XCDAT organization
  • Update repo links
    • README.rst
    • CONTRIBUTING.RST
    • docs/index.rst
    • docs/getting_started.rst
    • docs/project_maintenance.rst
    • setup.py
    • tbump.toml
    • conda-recipe/meta.yaml
  • Rename xCDAT to XCDAT

[Feature]: Add vertical regridding

Blocked by #71

Describe the solution you'd like

Vertical regridder that can remap hybrid vertical coordinates to standard pressure levels

Acceptance criteria:

  • Remapping to z-levels
  • Add more....

Additional context

cdms has reconstructPressureFromHybrid, logLinearInterpolation, and linearInterpolation
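The idea behind logLinearInterpolation can be sketched as interpolation in log-pressure space (all pressure and temperature values here are hypothetical):

```python
import numpy as np

# Log-linear vertical interpolation sketch: remap a profile to target
# pressure levels by interpolating over log(p).
p_src = np.array([1000.0, 850.0, 500.0, 250.0])  # hPa, source levels
t_src = np.array([288.0, 280.0, 255.0, 220.0])   # K
p_tgt = np.array([700.0, 300.0])                 # target pressure levels

# np.interp needs ascending x, so interpolate over ascending log(p).
order = np.argsort(p_src)
t_tgt = np.interp(np.log(p_tgt), np.log(p_src[order]), t_src[order])
print(t_tgt)  # values between the bracketing source levels
```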

Analyze regridding requirements and third-party libraries

This will help us determine the complexities and time estimates for horizontal and vertical regridding.

  1. Analyze our regridding requirements
  2. Determine if existing third-party libraries are sufficient in meeting requirements
    a. If sufficient, should we include the library in our dependencies, or should users install it separately in their environment?
    b. If not sufficient, can we extend the library(ies) or implement regridder(s) from scratch?

Add VSCode shared IDE settings

Describe the solution you'd like

Add VSCode shared IDE settings to streamline the contributor onboarding process. There is also an .editorconfig file which universally configures different IDEs.
