
chm's People

Contributors

andrebertoncini, chrismarsh, kevinrichardgreen, nicwayand, phillip-harder, shawn-mcadam, vvionnet, wbbssr


chm's Issues

Unique model log output

Currently writes out to CHM.log.

Would be useful to have it unique to that run. Especially for large ensemble runs.

Put the CHM.log in the output_dir. Optionally change its name to output_dir.log (or use another naming scheme).
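
A minimal sketch of one possible naming scheme. The `unique_log_path` helper, the schemes, and the timestamp format are all hypothetical, not existing CHM code:

```python
import os
from datetime import datetime

def unique_log_path(output_dir, scheme="timestamp"):
    """Build a per-run log path inside output_dir (hypothetical helper).

    'timestamp' appends the run start time so large ensemble runs
    do not clobber each other's CHM.log.
    """
    if scheme == "timestamp":
        stamp = datetime.now().strftime("%Y%m%dT%H%M%S")
        name = "CHM_%s.log" % stamp
    else:
        # fall back to naming the log after the output directory itself
        name = os.path.basename(os.path.normpath(output_dir)) + ".log"
    return os.path.join(output_dir, name)
```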

License

Code needs a license. Must be compatible with all the existing code.

BSD perhaps?

Implement user refinement in Triangle

Triangle allows for a user-supplied refinement function.

You write C code that examines a triangle's coordinates and area, and decides whether or not to refine the triangle.

https://www.cs.cmu.edu/~quake/triangle.u.html

I want the ability to use more than just area to produce triangles. For now, I would like to use:

  • triangle area
  • triangle tolerance [ difference between mean triangle elevation and underlying raster ]
  • @NicWayand, you suggested standard deviation of terrain. Can you elaborate on what you mean by this?

Tolerance: Imagine a planar side-view of the triangle plane acting as a linear interpolant. The linear interpolant will overlap a set of raster cells and will extend either above or below them. The tolerance can then be thought of as the error between the triangle and the underlying cells.

Thus it allows for controlling the size of the triangle in areas of steep topography.

For all of this discussion, I am assuming triunsuitable() gets a triangle t as input that has x,y coordinates. There will be no elevation, etc.

We can allow the user to specify a max area, as is currently used. The default should be much higher, since we can spatially refine as follows. We can then allow the user to specify a max tolerance (in metres).

It will be somewhat computationally expensive to do this. We can do this the same way mesher extracts parameters. Mesher produces a window into a raster that corresponds to the area the triangle intersects via gdal.

We can do the same to extract the raster DEM subarea that t overlaps. This lets us get at the elevation. Then we can compute the maximum error imposed by our triangular interpolator. Then, we can choose to refine the triangle if we exceed this tolerance.
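
The tolerance computation above can be sketched with NumPy. Everything here is illustrative: `triangle_tolerance` is a hypothetical helper, the vertices are given elevations directly for the example (in triunsuitable() they would first be sampled from the raster), and the real implementation would extract the raster window via GDAL as described:

```python
import numpy as np

def plane_from_triangle(p0, p1, p2):
    """Coefficients (a, b, c) of the plane z = a*x + b*y + c
    through three (x, y, z) vertices. Hypothetical helper."""
    A = np.array([[p0[0], p0[1], 1.0],
                  [p1[0], p1[1], 1.0],
                  [p2[0], p2[1], 1.0]])
    z = np.array([p0[2], p1[2], p2[2]])
    return np.linalg.solve(A, z)

def triangle_tolerance(tri_xyz, xs, ys, dem):
    """Max |plane - raster| over DEM cells whose centres fall inside
    the triangle. Hypothetical helper: the real code would get the
    raster subarea for t via GDAL, as mesher does for parameters."""
    a, b, c = plane_from_triangle(*tri_xyz)
    X, Y = np.meshgrid(xs, ys)
    # barycentric point-in-triangle test on cell centres
    (x0, y0, _), (x1, y1, _), (x2, y2, _) = tri_xyz
    det = (y1 - y2) * (x0 - x2) + (x2 - x1) * (y0 - y2)
    l0 = ((y1 - y2) * (X - x2) + (x2 - x1) * (Y - y2)) / det
    l1 = ((y2 - y0) * (X - x2) + (x0 - x2) * (Y - y2)) / det
    l2 = 1.0 - l0 - l1
    inside = (l0 >= 0) & (l1 >= 0) & (l2 >= 0)
    # error between the triangle's linear interpolant and the raster
    plane_z = a * X + b * Y + c
    return np.max(np.abs(plane_z[inside] - dem[inside]))
```

The refinement decision would then be `refine = area > max_area or triangle_tolerance(...) > max_tolerance`, which keeps triangles small only where the plane fits the terrain poorly (i.e. steep or rough topography).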

Check that parameter meshes are the correct size

If a module writes out a parameter that is later used to init the mesh, and the base DEM has changed, there is no safeguard to catch the mismatch.

Perhaps make a hash of the base DEM to ensure it matches?
At minimum, the size of the parameter mesh needs to be checked.
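
A sketch of the suggested safeguard. The helper names (`file_sha256`, `check_parameter_mesh`) are hypothetical, and it assumes the parameter mesh stores the hash of the DEM it was generated from:

```python
import hashlib

def file_sha256(path, chunk=1 << 20):
    """SHA-256 of a file, read in chunks so large DEMs don't blow memory.
    Hypothetical helper."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

def check_parameter_mesh(param_nvert, mesh_nvert, param_dem_hash, base_dem_hash):
    """Refuse to run if the parameter mesh does not match the mesh/DEM.
    Hypothetical helper."""
    if param_nvert != mesh_nvert:
        raise ValueError("parameter mesh size %d != mesh size %d"
                         % (param_nvert, mesh_nvert))
    if param_dem_hash != base_dem_hash:
        raise ValueError("parameter mesh was generated from a different base DEM")
```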

Mesher fails if given paths

If paths to DEMs etc. are given to mesher, such as:

parameter_files = {
    'landcover': {'file': 'dems/eosd.tif',
                  'method': 'mode'},  # mode, mean
    'svf': {'file': 'dems/wolf_svf1.tif',
            'method': 'mean'
            }
}

mesher fails to handle the path properly:

ERROR 4: Attempt to create new tiff file `dems/wolf_lidar1/dems/wolf_lidar1_projected.tif' failed: No such file or directory
Creating output file that is 1089P x 867L.
Traceback (most recent call last):
  File "/home/chris/Documents/PhD/code/CHM/tools/mesher/main.py", line 617, in <module>
    main()
  File "/home/chris/Documents/PhD/code/CHM/tools/mesher/main.py", line 188, in main
    subprocess.check_call(['gdalwarp %s %s -overwrite -dstnodata -9999 -t_srs "EPSG:%d"' % (dem_filename, base_dir + base_name+'_projected.tif',EPSG)], shell=True)
  File "/usr/lib64/python2.7/subprocess.py", line 540, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['gdalwarp dems/wolf_lidar1.tif dems/wolf_lidar1/dems/wolf_lidar1_projected.tif -overwrite -dstnodata -9999 -t_srs "EPSG:26908"']' returned non-zero exit status 1
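
One possible fix, sketched in Python 3 (`projected_path` is a hypothetical helper; the actual main.py builds the path inline, and note `exist_ok` is Python 3 only while main.py currently runs under Python 2.7): use only the basename of the input DEM, so `dems/` does not leak into the output path, and create the output directory before calling gdalwarp.

```python
import os

def projected_path(dem_filename, out_root):
    """Build the output path for the reprojected DEM (hypothetical helper).

    Strips the input's directory so 'dems/wolf_lidar1.tif' yields
    '<out_root>/wolf_lidar1/wolf_lidar1_projected.tif' rather than a
    nonexistent 'dems/wolf_lidar1/dems/...' path, and creates the
    output directory before gdalwarp tries to write into it.
    """
    base_name = os.path.splitext(os.path.basename(dem_filename))[0]
    out_dir = os.path.join(out_root, base_name)
    os.makedirs(out_dir, exist_ok=True)
    return os.path.join(out_dir, base_name + "_projected.tif")
```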

output netcdf file

NetCDF files will be easier to work with than individual TIFFs, especially for multi-year/multi-scenario simulations. They should also be more space efficient.

Should be able to use the same vtu2geo tool; add an option to write out to NetCDF (using the xarray module).

Optional length of time covered by each individual NetCDF file: one time step (not efficient), 1 day, 1 month, 1 year, etc.
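
The per-file time grouping could be sketched as below (`group_timesteps` is a hypothetical helper; the actual NetCDF writing would be handled by xarray's `to_netcdf`):

```python
def group_timesteps(times, freq="month"):
    """Group datetime objects into buckets, one output NetCDF file per
    bucket. Hypothetical helper; freq values shown are illustrative."""
    def key(t):
        if freq == "year":
            return (t.year,)
        if freq == "month":
            return (t.year, t.month)
        if freq == "day":
            return (t.year, t.month, t.day)
        return (t,)  # one file per time step (not efficient)
    groups = {}
    for t in times:
        groups.setdefault(key(t), []).append(t)
    return groups
```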

Remove user configuration from /tools/mesher/main.py

I suggest changing main.py to take an input configuration file. That way, the code is separate from the basin/domain-specific configuration info. It will make versioning simpler as well, because users' configurations will not have to be committed.
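
A sketch of what the separation could look like, assuming a JSON configuration file (hypothetical; the required keys shown are only illustrative):

```python
import argparse
import json

def parse_args(argv):
    """Require a user-supplied configuration file instead of constants
    hard-coded in main.py. Hypothetical interface."""
    ap = argparse.ArgumentParser(description="mesher")
    ap.add_argument("config", help="path to a JSON configuration file")
    return ap.parse_args(argv)

def load_config(path):
    """Load and minimally validate the configuration (illustrative keys)."""
    with open(path) as f:
        cfg = json.load(f)
    for required in ("dem_filename", "EPSG"):
        if required not in cfg:
            raise KeyError("missing required key: %s" % required)
    return cfg
```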

-9999 missing value

Confirm that -9999 values in input met files are properly treated as missing and that nan("") is substituted.
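
A minimal sketch of the intended substitution rule (hypothetical helper; CHM's met reader is C++, this only illustrates the behaviour to confirm):

```python
import math

MISSING = -9999.0

def clean_value(v, missing=MISSING):
    """Substitute NaN for the -9999 missing-value sentinel in a met
    record. Hypothetical helper."""
    x = float(v)
    return float("nan") if x == missing else x
```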

Masking triangle configuration option

Allow, as an optional input within the CHM.json configuration file, a raster(?) mask of which triangles to run on (1 = run, 0 = skip). Default behaviour if a mask is not provided should be to run all triangles. This will allow a user to mask out non-basin areas, or triangles near the edges that are not optimal shapes.
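
A sketch of the proposed default behaviour (`select_triangles` is a hypothetical helper; the real input would be a raster mask sampled onto the triangles):

```python
def select_triangles(n_triangles, mask=None):
    """Return the indices of triangles to run. Hypothetical helper.

    Default (no mask): run all triangles.
    With a mask (1 = run, 0 = skip): run only the flagged triangles.
    """
    if mask is None:
        return list(range(n_triangles))
    if len(mask) != n_triangles:
        raise ValueError("mask length != number of triangles")
    return [i for i, m in enumerate(mask) if m == 1]
```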

Move hard coded interpolation parameters into configuration

Where do the lapse rate values used in met interpolation come from?

Which Thornton paper?

double monthly_factors[] = {0.35, 0.35, 0.35, 0.30, 0.25, 0.20, 0.20, 0.20, 0.20, 0.25, 0.30, 0.35};

Which Liston paper?

These values can vary greatly depending on the region.

I propose two options:

  1. Allow the user to define them for a region (in the configuration file) if known; otherwise default to these literature(?) values.
  2. Calculate them from available stations (could be tricky depending on station coverage). Would work well if using mesoscale output.

Thoughts?
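
Option 1 could be sketched as a config lookup with a fallback (hypothetical helper and config key; the defaults are the hard-coded values quoted above):

```python
# Literature defaults: the monthly factors currently hard-coded in the
# C++ source (provenance to be confirmed, per the question above).
DEFAULT_MONTHLY_FACTORS = [0.35, 0.35, 0.35, 0.30, 0.25, 0.20,
                           0.20, 0.20, 0.20, 0.25, 0.30, 0.35]

def monthly_factors(config):
    """Option 1: take user-supplied factors from the configuration if
    present, otherwise fall back to the literature defaults.
    Hypothetical helper and 'monthly_factors' key."""
    factors = config.get("monthly_factors", DEFAULT_MONTHLY_FACTORS)
    if len(factors) != 12:
        raise ValueError("monthly_factors must have 12 entries")
    return factors
```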

Metfiles require same columns

There is an assumption that all met files have the same columns. If this isn't the case, there is a segfault on the first time step. There should at least be a check for this.
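
The suggested check could look like this (hypothetical helper operating on headers already read from each file):

```python
def check_same_columns(headers_by_file):
    """Raise before the first time step if met files disagree on their
    columns, instead of segfaulting later. Hypothetical helper; takes a
    mapping of filename -> list of column names."""
    items = list(headers_by_file.items())
    ref_file, ref_cols = items[0]
    for fname, cols in items[1:]:
        if set(cols) != set(ref_cols):
            differing = sorted(set(cols) ^ set(ref_cols))
            raise ValueError("met file %s has different columns than %s: %s"
                             % (fname, ref_file, differing))
```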

Lat / Long support

Experimental support is now enabled for lat/long input files in commit 65693a0.

Current problems

Make error message for dependency more understandable

For example:

/opt/boost/boost_1_61_0/include/boost/graph/topological_sort.hpp(41): Throw in function void boost::topo_sort_visitor<OutputIterator>::back_edge(const Edge&, Graph&) [with Edge = boost::detail::edge_desc_impl<boost::directed_tag, long unsigned int>; Graph = const boost::adjacency_list<boost::vecS, boost::vecS, boost::directedS, boost::property<boost::vertex_index_t, int, vertex>, edge>; OutputIterator = std::front_insert_iterator<std::deque<int> >] Dynamic exception type: boost::exception_detail::clone_impl<boost::exception_detail::error_info_injector<boost::not_a_dag> > std::exception::what: The graph must be a DAG.

Add check that time steps are continuous

Potential bug: CHM appears to run without error if forcing time steps are not continuous in time. For example, I am missing the Aug 20th GEM forcing in the period of the 18th to the 21st. I would expect this to throw an error.
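
A sketch of the suggested check, assuming the forcing should be evenly spaced (hypothetical helper):

```python
def check_continuous(times):
    """Raise if forcing time steps are not evenly spaced, e.g. a missing
    day inside the record. Hypothetical helper; `times` is a sorted list
    of datetime objects."""
    if len(times) < 2:
        return
    dt = times[1] - times[0]  # assume the first interval is the step
    for prev, cur in zip(times, times[1:]):
        if cur - prev != dt:
            raise ValueError("gap in forcing between %s and %s "
                             "(expected step %s)" % (prev, cur, dt))
```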

Module dependency

Modules need to be able to specify dependencies upon variables. These variables may be output from other modules; therefore, inter-module dependencies are required.

Output dir of output point files

Currently the path to output files is assumed to be relative to the run dir:
"file": "BNS_out.txt",
This is an issue when you want to run multiple configurations modified via the command line (it would require N station command-line changes for each output file).

Suggest adding new variable point_out_dir where point files are written.

@Chrismarsh is there already something like this? If not I will address.

segfault when command line options incorrect

[debug]: Set project name to downscal_exp
[debug]: per triangle timer series storage: false
./EXP_1.sh: line 7: 17100 Segmentation fault (core dumped) ./CHM -f GEM_west_fortress.json -c config.interpolant:"nearest"

Require input configuration file on the command line

Remove the default behaviour of running the code with CHM.json; require the user to explicitly specify a configuration file.

Motivation is that it would remove the edge case where you accidentally run the wrong configuration file.

Running slope_iswr in point mode

New Simple_Canopy module needs the iswr_diffuse variable from slope_iswr, but slope_iswr is not run in point mode. Is there any simple way around this to enable point testing of Simple_Canopy?

Documentation is lacking

Currently exists but is lacking:

  • Code documentation is via Doxygen

Need to do:

  • Users' guide
  • Module development guide
  • Module guide
  • Development guide

Currently thinking a LaTeX document, but I wonder whether a GitHub wiki is a better location for this.

Water/Land/other mask required as parameter in mesher

Add to mesher:

Land/water parameter. That way certain modules can be constrained to run only over particular surface types (e.g. no snow module over oceans without ice, or lake/glacier modules running only over those triangle types).

Time averaging

Splitting #14

It would be good to have a module that calculates the time average of fluxes to diagnose the seasonality of various fluxes.
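
The core of such a module could be as simple as a running mean (hypothetical sketch; the real module would operate on per-triangle flux output at each time step):

```python
def time_average(flux_series, window):
    """Running mean of a flux over `window` consecutive time steps,
    e.g. to diagnose the seasonality of daily fluxes from hourly output.
    Hypothetical helper; uses a sliding sum to stay O(n)."""
    if window < 1 or window > len(flux_series):
        raise ValueError("window must be between 1 and len(flux_series)")
    out = []
    s = sum(flux_series[:window])
    out.append(s / window)
    for i in range(window, len(flux_series)):
        # slide the window: add the new sample, drop the oldest
        s += flux_series[i] - flux_series[i - window]
        out.append(s / window)
    return out
```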
