echopype-checker's People

Contributors

emiliom

Forkers

emiliom

echopype-checker's Issues

Eliminate creation of netcdf files from CDLs

In cc._process_cdl(), the nc4.Dataset.fromcdl statement creates a netcdf file for each CDL file being read. But my reading of nc4.Dataset.fromcdl suggested that no netcdf file would be created when the argument ncfilename is set to None.

https://github.com/OSOceanAcoustics/sonarnetcdf4-echopype-checker/blob/6ab5ea3e485d940a83a2e0bfd66b43de931f0aa1/sonarnetcdf4_echopype_checker/cc.py#L75

This is harmless but clutters the cdls folder with netcdf files. Look into it.
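
If it turns out that fromcdl always writes a file (it runs ncgen and then opens the result), one option is to point ncfilename at a temporary location and remove it once the Dataset has been read. A minimal sketch, not the current cc._process_cdl() code; the open_cdl helper name and the cleanup behavior are assumptions:

    import tempfile
    from contextlib import contextmanager
    from pathlib import Path

    import netCDF4 as nc4


    @contextmanager
    def open_cdl(cdl_path):
        """Yield an nc4.Dataset built from a CDL file without leaving a .nc
        file next to the CDL source."""
        cdl_path = Path(cdl_path)
        with tempfile.TemporaryDirectory() as tmpdir:
            nc_path = Path(tmpdir) / cdl_path.with_suffix(".nc").name
            # fromcdl runs ncgen to write nc_path, then opens it as a Dataset
            ds = nc4.Dataset.fromcdl(str(cdl_path), ncfilename=str(nc_path))
            try:
                yield ds
            finally:
                # close before the temporary directory (and .nc file) is removed
                ds.close()

With something like this, the cdls folder would stay free of generated netcdf files.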

Resolve failure to recognize as identical `string` type and numpy `object` type containing only strings

cc._dtype_compare() is supposed to recognize string type and numpy object type containing only strings as identical (when using dtype_strict=False), in this block:

https://github.com/OSOceanAcoustics/sonarnetcdf4-echopype-checker/blob/6ab5ea3e485d940a83a2e0bfd66b43de931f0aa1/sonarnetcdf4_echopype_checker/cc.py#L23-L32

I tested this when I developed the scheme, and it worked. But I'm now seeing one case where it fails: Platform.sentence_type is of string (<U3) type in the EchoData object being tested and of object type in the CDL-derived source:

[screenshot: comparison output showing the Platform.sentence_type dtype mismatch]

Look into this, to ensure they're treated as equivalent.
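
For reference while debugging, here is a small, self-contained restatement of the intended comparison (not the code in cc._dtype_compare(); the helper name and example values are made up) that treats a numpy string dtype such as <U3 and an object dtype holding only Python strings as equivalent when dtype_strict=False:

    import numpy as np


    def dtypes_equivalent(a, b, dtype_strict=True):
        """Compare the dtypes of two arrays, optionally treating numpy string
        dtypes and all-string object arrays as identical."""
        if a.dtype == b.dtype:
            return True
        if dtype_strict:
            return False

        def is_stringy(arr):
            # <U / <S dtypes, or object dtype whose values are all str
            return arr.dtype.kind in ("U", "S") or (
                arr.dtype == object and all(isinstance(v, str) for v in arr.ravel())
            )

        return is_stringy(a) and is_stringy(b)


    # The Platform.sentence_type case: <U3 in the EchoData object vs. object
    # dtype in the CDL-derived source (values shown here are placeholders)
    echodata_vals = np.array(["GGA", "VTG"])           # dtype <U3
    cdl_vals = np.array(["GGA", "VTG"], dtype=object)  # object dtype of str
    assert dtypes_equivalent(echodata_vals, cdl_vals, dtype_strict=False)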

Create instrument-specific `Vendor` group CDLs for the compliance checker

Leverage this package to help monitor changes in the Vendor group as well. Unlike for all other groups, the SONAR-netCDF4 v1 convention does not specify anything about the content of this group, so all of its content is based on echopype decisions.

  • @leewujung: Create small netcdf files from the Vendor group for each major echopype instrument type: ek60, ek80, and azfp.
  • @emiliom: Use those netcdf files to create a CDL file per group; fine-tune and prune each CDL; then incorporate each CDL into the code base (a rough sketch of both steps follows below).
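
A rough sketch of the two steps above, assuming echopype's open_raw() / to_netcdf() APIs and the "Vendor_specific" group name; the group name, raw-file paths, and output file names are placeholders that should be checked against the echopype version in use:

    import subprocess

    import echopype as ep

    # Placeholder raw files, one per major instrument type
    RAW_FILES = {
        "ek60": ("example_ek60.raw", "EK60"),
        "ek80": ("example_ek80.raw", "EK80"),
        "azfp": ("example_azfp.01A", "AZFP"),  # AZFP also needs xml_path=...
    }

    for name, (raw_file, sonar_model) in RAW_FILES.items():
        ed = ep.open_raw(raw_file, sonar_model=sonar_model)
        # Save only the Vendor-specific group to a small netcdf file
        ed["Vendor_specific"].to_netcdf(f"vendor_{name}.nc")
        # Dump the header to a CDL that can then be pruned and added to the repo
        with open(f"vendor_{name}.cdl", "w") as f:
            subprocess.run(["ncdump", "-h", f"vendor_{name}.nc"], stdout=f, check=True)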

Use case to ensure data conversion outcome

We are converting large volumes of echosounder data using echopype and would like to have something in the workflow to verify the integrity of the data conversion outcome. This package would be a natural way to do it, if we can add an API that surfaces some outputs indicating whether the converted data is in good order (and, if not, what is missing).

In particular, @Sohambutala is configuring echoflow to handle automatic and parallel data conversion, and it'd be good to have a task/flow after the conversion step to ensure data integrity. Also adding @ctuguinay here since he brought this need up yesterday.
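
Purely as an illustration of the kind of API being asked for (the check_conversion() helper, its report format, and the Prefect-style task wrapper are all hypothetical, not part of the current package):

    import echopype as ep
    from prefect import task


    def check_conversion(converted_path):
        """Compare a converted file's groups against the reference CDLs and
        return a report of what is missing or mismatched (hypothetical API)."""
        ed = ep.open_converted(converted_path)
        report = {"file": str(converted_path), "passed": True, "problems": []}
        # per-group comparison against the packaged CDLs would go here
        return report


    @task
    def verify_converted_file(converted_path):
        """Echoflow-style post-conversion task: fail loudly if the converted
        data is not in good order."""
        report = check_conversion(converted_path)
        if not report["passed"]:
            raise ValueError(f"Conversion check failed: {report['problems']}")
        return report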
