oggm / agile
AGILE: Open Global Glacier Data Assimilation Framework
License: BSD 3-Clause "New" or "Revised" License
Currently, COMBINE only saves the resulting flowline, which represents today's glacier state (and can be used as a starting point for projections).
But it might be a good idea to also save the complete glacier evolution from the start time up to today (similar to OGGM's model_diagnostics
for area and volume, or model_geometry
for hydrological applications). For this, COMBINE could conduct one OGGM run_until_and_store
at the end, using all of the final parameters.
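The idea can be sketched with a toy stand-in (this is not the OGGM API — in practice one would call `run_until_and_store` on the dynamical model with the final, optimised parameters; the toy model and its dynamics below are placeholders):

```python
# Toy sketch: store the full evolution instead of only the final state.
# NOT the OGGM API; ToyGlacierModel and its dynamics are placeholders.

class ToyGlacierModel:
    """Minimal stand-in for a dynamical glacier model."""

    def __init__(self, volume, area, dvdt=-0.1, dadt=-0.05):
        self.volume = volume
        self.area = area
        self.dvdt = dvdt  # toy volume change per year
        self.dadt = dadt  # toy area change per year

    def step(self):
        self.volume = max(0.0, self.volume + self.dvdt)
        self.area = max(0.0, self.area + self.dadt)


def run_until_and_store(model, n_years):
    """Run the model and keep per-year diagnostics, not just the end state."""
    diagnostics = {'year': [], 'volume': [], 'area': []}
    for year in range(n_years + 1):
        diagnostics['year'].append(year)
        diagnostics['volume'].append(model.volume)
        diagnostics['area'].append(model.area)
        if year < n_years:
            model.step()
    return diagnostics


diag = run_until_and_store(ToyGlacierModel(volume=10.0, area=5.0), n_years=20)
```

The point is simply that the per-year diagnostics are collected during the run, so the full evolution is available afterwards, not only the final flowline.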
More advanced gradient-based optimization methods like L-BFGS-B and CG seem to have problems estimating step widths; the range over which the gradients are actually valid appears to be smaller than these methods assume.
Maybe a naive approach would be more efficient.
This idea came to my mind because the gradients often point in the right direction for the optimization, but the applied bed update does not correspond to the gradient ...
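One possible "naive approach" (my own assumption of what this could look like, not something settled in this issue) is to take only the sign of each gradient component and apply a fixed step, similar in spirit to sign-SGD/Rprop, so the update magnitude cannot be corrupted by badly scaled gradients:

```python
import numpy as np


def sign_step_minimise(grad_fn, x0, step=0.05, n_iter=100):
    """Naive gradient-sign descent: fixed step size, direction from sign(grad).

    Robust to badly scaled gradients because only the sign is used;
    the iterate can only oscillate within +/- step around a minimum.
    """
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(n_iter):
        g = grad_fn(x)
        x -= step * np.sign(g)
    return x


# Placeholder test problem: minimise f(x) = sum(x**2), gradient 2*x.
x_final = sign_step_minimise(lambda x: 2 * x, x0=[3.0, -2.0],
                             step=0.05, n_iter=100)
```

The trade-off is obvious: convergence stalls at the step-size scale, so the fixed step would have to be shrunk over the course of the optimisation.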
COMBINE uses the same dynamic core as OGGM, which is known to create numerical instabilities under certain (not well understood) conditions. These instabilities cause the calculated gradients of control variables located inside the instability to be orders of magnitude larger than those of control variables outside of it. This leads to a failure of the minimisation algorithm.
One idea to cope with this is to apply a filter to the gradients before they are returned to the minimisation algorithm (idea from discussion with @dngoldberg).
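One possible realisation of such a filter (the median-based threshold is my own assumption, not an established recipe from the discussion): cap gradient entries whose magnitude is far above the typical gradient scale, keeping their sign.

```python
import numpy as np


def filter_gradients(grad, k=5.0):
    """Cap gradient entries whose magnitude is far above the typical scale.

    Entries larger (in absolute value) than k times the median absolute
    gradient are clipped to that threshold, keeping their sign. The
    median-based threshold is an assumption for illustration.
    """
    g = np.asarray(grad, dtype=float)
    scale = np.median(np.abs(g))
    if scale == 0.0:
        return g.copy()
    threshold = k * scale
    return np.clip(g, -threshold, threshold)


# One spiked entry, as produced by a numerical instability:
g = np.array([0.8, -1.2, 1.0, 250.0, -0.9])
g_filt = filter_gradients(g, k=5.0)
```

Entries outside the instability pass through unchanged; only the spike is capped, so the minimiser still sees a consistent descent direction.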
Another idea is to try out a flux limiter that dynamically switches between upwinding and Lax-Wendroff (idea from discussion with @jrmaddison).
Let's see if one of the two ideas provides a solution. I will update this issue with the things I have tried.
Second-order derivatives are in theory possible using PyTorch (via Hessian-vector products) and could improve gradient-based optimization (L-BFGS-B, Newton-CG, ...)
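As a sketch of how that second-order information would be used: Newton-CG only ever needs Hessian-vector products, never the full Hessian. Below is a self-contained numpy illustration on a quadratic (PyTorch would supply the product automatically, e.g. via `torch.autograd.functional.hvp`; the quadratic test problem here is my own placeholder):

```python
import numpy as np


def newton_cg_step(grad, hvp, x, cg_iters=50, tol=1e-10):
    """One Newton step: solve H p = -grad(x) by conjugate gradients,
    using only Hessian-vector products (what an HVP provides)."""
    g = grad(x)
    p = np.zeros_like(x)   # CG iterate for the Newton direction
    r = -g - hvp(p)        # residual of H p = -g
    d = r.copy()
    rs = r @ r
    for _ in range(cg_iters):
        if np.sqrt(rs) < tol:
            break
        Hd = hvp(d)
        alpha = rs / (d @ Hd)
        p += alpha * d
        r -= alpha * Hd
        rs_new = r @ r
        d = r + (rs_new / rs) * d
        rs = rs_new
    return x + p


# Placeholder problem: f(x) = 0.5 x^T A x - b^T x,
# so grad(x) = A x - b and the Hessian-vector product is H v = A v.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x_new = newton_cg_step(lambda x: A @ x - b, lambda v: A @ v, np.zeros(2))
```

For a quadratic, a single exact Newton step lands on the minimum; in the nonlinear COMBINE setting the same structure would be iterated, with the HVP coming from PyTorch's double backward.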
Some functionalities within core OGGM that COMBINE relies on have changed. This calls for updates.
It is your choice, of course, but if we do it, we should do it before your thesis is submitted.
Out for a month now: https://github.com/google/jax
Twitter is divided about the actual differences with PyTorch, but the clear scope of JAX makes it worth a try (maybe faster than PyTorch?).