oggm / agile

AGILE: Open Global Glacier Data Assimilation Framework

License: BSD 3-Clause "New" or "Revised" License

Python 19.50% Jupyter Notebook 80.48% Dockerfile 0.01% Shell 0.01%
inversion bedrock-inversion oggm cost-optimization data-assimilation glacier-modelling

agile's People

Contributors

fmaussion, pat-schmitt, phigre

agile's Issues

Add final glacier evolution to Output

Currently, COMBINE only saves the resulting flowline, representing today's glacier state (which can be used as a starting point for projections).

But maybe it would be a good idea to also save the complete glacier evolution from the start time up to today (similar to OGGM's model_diagnostics for area and volume, or model_geometry for hydrological applications). For this, COMBINE could conduct one OGGM run_until_and_store at the end, using all of the final parameters (see the sketch below).
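A minimal sketch of what this could look like, assuming the final calibrated flowlines and mass-balance model are available as final_flowlines and final_mb_model (placeholder names); the exact run_until_and_store signature and return values depend on the OGGM version:

```python
# Sketch only: re-run the forward model with the final parameters and store
# the full evolution, as OGGM does for projections.
from oggm.core.flowline import FluxBasedModel

model = FluxBasedModel(final_flowlines,          # placeholder: calibrated flowlines
                       mb_model=final_mb_model,  # placeholder: calibrated mass balance
                       y0=spinup_start_year)     # placeholder: start year of the run
out = model.run_until_and_store(today_year)      # store the evolution up to today
# `out` is one or more xarray datasets (volume, area, length, ...) depending on
# the OGGM version; these could be written next to the existing COMBINE output.
```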

Include naive gradient-based optimization

More advanced gradient-based optimization methods such as L-BFGS-B and CG seem to have problems estimating step widths, and the range over which the gradients they use are valid seems to be underestimated.
Maybe a naive gradient-descent approach with a small fixed step size would be more efficient, e.g. the sketch below.
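A minimal sketch of such a naive update, assuming a PyTorch cost function cost_fn over the bed heights (both names are placeholders, not existing COMBINE code):

```python
import torch

def naive_descent(cost_fn, bed_init, step=0.5, n_iter=100):
    """Plain gradient descent with a fixed step size along the gradient direction."""
    bed = bed_init.detach().clone().requires_grad_(True)
    for _ in range(n_iter):
        cost = cost_fn(bed)
        grad, = torch.autograd.grad(cost, bed)
        with torch.no_grad():
            # fixed-length step along the (normalised) gradient direction,
            # instead of a line search as in L-BFGS-B or CG
            bed -= step * grad / (grad.norm() + 1e-12)
    return bed.detach()
```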

This idea came to mind because the gradients often point in the right direction for the optimization, but the bed update actually applied by these methods does not correspond to the gradient ...

Instabilities of the dynamic core cause problems with AD

COMBINE uses the same dynamic core as OGGM, which is known to develop numerical instabilities under certain (not well understood) conditions. These instabilities cause the calculated gradients of the control variables located inside the instability to be orders of magnitude larger than those of control variables outside of it. This leads to a failure of the minimisation algorithm.

An idea to cope with this is to apply a filter to the gradients before they are returned to the minimisation algorithm (idea from a discussion with @dngoldberg); see the sketch below.

Another idea is to try out a flux limiter that dynamically switches between upwinding and Lax-Wendroff (idea from a discussion with @jrmaddison).

Let's see if one of the two ideas can provide a solution. I will update this Issue with the things I have tried.
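A minimal sketch of the first idea: robust clipping of outlier gradient components before they reach the minimiser. The threshold (a fixed multiple of the median gradient magnitude) is an assumption for illustration, not something COMBINE prescribes:

```python
import torch

def filter_gradient(grad, factor=10.0):
    """Clamp gradient components that are far larger than the typical magnitude."""
    med = grad.abs().median().item()   # robust estimate of the typical gradient size
    bound = factor * med               # everything beyond this is treated as an outlier
    return torch.clamp(grad, min=-bound, max=bound)
```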

Improve code quality

  • Create and incorporate naming conventions
  • Add documentation
  • Consistently use os.path.join for filenames (see the example after this list)
  • Restructure code and source files into logical units
  • Get closer to the OGGM GlacierDirectory idea (e.g. save spinup states, ...)
  • ...
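A purely illustrative example of the os.path.join convention (the gdir variable stands for an OGGM GlacierDirectory):

```python
import os

# instead of manual string concatenation such as gdir.dir + '/' + 'spinup.nc'
fpath = os.path.join(gdir.dir, 'spinup.nc')
```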

Think about second order derivatives

Second-order derivatives are in theory possible using PyTorch (via Hessian-vector products) and could improve gradient-based optimization (L-BFGS-B, Newton-CG, ...); see the sketch below.
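A minimal sketch of a Hessian-vector product in PyTorch, which is what Newton-CG-style methods need instead of the full Hessian; cost_fn and bed are placeholders for the COMBINE cost function and control variables:

```python
import torch

def hessian_vector_product(cost_fn, bed, vec):
    """Return H(bed) @ vec without forming the Hessian explicitly."""
    # double backward under the hood; cost_fn must return a scalar
    _, hvp = torch.autograd.functional.hvp(cost_fn, bed, vec)
    return hvp

# toy usage with a quadratic-like cost: H of sum(b**4) is diag(12 * b**2)
hv = hessian_vector_product(lambda b: (b ** 4).sum(), torch.ones(5), torch.ones(5))
```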

JAX

Out for about a month now: https://github.com/google/jax

Twitter is divided about how much it actually differs from PyTorch, but the clear scope of JAX makes it worth a try (maybe faster than PyTorch?); a tiny sketch of what a gradient computation looks like in JAX is given below.
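A minimal sketch (the cost function is a stand-in, not existing COMBINE code) of how JAX expresses the same kind of gradient computation:

```python
import jax
import jax.numpy as jnp

def cost_fn(bed):
    # stand-in for a real misfit between modelled and observed quantities
    return jnp.sum((bed ** 2 - 1.0) ** 2)

grad_fn = jax.jit(jax.grad(cost_fn))   # compiled gradient of the scalar cost
g = grad_fn(jnp.full(5, 0.5))          # gradient w.r.t. the bed heights
```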
