
level1c4pps's People

Contributors

adybbroe, mraspaud, ninahakansson, pnuu, sfinkens, shornqui, smhi-erik


level1c4pps's Issues

Update MSG-4 calibration coefficients

MSG-4 calibration coefficients need a final update for CLAAS-3. From JFM:

S = 21.040 + 0.2877e-3 * day_2000  (ch1)
S = 24.966 + 0.6074e-3 * day_2000  (ch2)
S = 21.236 + 0.1420e-3 * day_2000  (ch3)
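
For reference, a minimal Python sketch of how these slopes could be evaluated, assuming day_2000 counts days since 2000-01-01 (the function and its name are illustrative, not part of level1c4pps):

from datetime import date

def msg4_vis_slopes(acquisition_date):
    """Evaluate the CLAAS-3 MSG-4 calibration slopes quoted above."""
    # Assumption: day_2000 is the number of days since 2000-01-01.
    day_2000 = (acquisition_date - date(2000, 1, 1)).days
    return {'ch1': 21.040 + 0.2877e-3 * day_2000,
            'ch2': 24.966 + 0.6074e-3 * day_2000,
            'ch3': 21.236 + 0.1420e-3 * day_2000}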

Mismatch between the orbit number in the NetCDF filename and the posttroll message tag 'orbit_number' for VIIRS when using the level 1c runner in pytroll-pps-runner

Problem description

When sending in VIIRS data with a posttroll message like this:

{'orig_platform_name': 'npp',
 'start_time': datetime.datetime(2021, 11, 25, 8, 53, 5),
 'start_decimal': 4,
 'end_time': datetime.datetime(1900, 1, 1, 8, 54, 29),
 'end_decimal': 6,
 'orbit_number': 52224,
 'proctime': '20211125090537000265',
 'stream': 'eumetcast',
 'variant': 'EARS',
 'format': 'PPS',
 'type': 'HDF5',
 'data_processing_level': '2',
 'platform_name': 'Suomi-NPP',
 'tle_platform_name': 'SUOMI NPP',
 'antenna': 'ears',
 'origin': '157.249.16.188:9227',
 'dataset': [{'uri': '/data/pytroll/eum-sdr/GDNBO_npp_d20211125_t0853054_e0854296_b52224_c20211125090735000933_eum_ops.h5', 'uid': 'GDNBO_npp_d20211125_t0853054_e0854296_b52224_c20211125090735000933_eum_ops.h5'},
             {'uri': '/data/pytroll/eum-sdr/SVDNB_npp_d20211125_t0853054_e0854296_b52224_c20211125090739000552_eum_ops.h5', 'uid': 'SVDNB_npp_d20211125_t0853054_e0854296_b52224_c20211125090739000552_eum_ops.h5'},
             {'uri': '/data/pytroll/eum-sdr/GMODO_npp_d20211125_t0853054_e0854296_b52224_c20211125090800000785_eum_ops.h5', 'uid': 'GMODO_npp_d20211125_t0853054_e0854296_b52224_c20211125090800000785_eum_ops.h5'},
             {'uri': '/data/pytroll/eum-sdr/SVM01_npp_d20211125_t0853054_e0854296_b52224_c20211125090802000875_eum_ops.h5', 'uid': 'SVM01_npp_d20211125_t0853054_e0854296_b52224_c20211125090802000875_eum_ops.h5'},
             {'uri': '/data/pytroll/eum-sdr/SVM02_npp_d20211125_t0853054_e0854296_b52224_c20211125090802000888_eum_ops.h5', 'uid': 'SVM02_npp_d20211125_t0853054_e0854296_b52224_c20211125090802000888_eum_ops.h5'},
             {'uri': '/data/pytroll/eum-sdr/SVM03_npp_d20211125_t0853054_e0854296_b52224_c20211125090802000902_eum_ops.h5', 'uid': 'SVM03_npp_d20211125_t0853054_e0854296_b52224_c20211125090802000902_eum_ops.h5'},
             {'uri': '/data/pytroll/eum-sdr/SVM04_npp_d20211125_t0853054_e0854296_b52224_c20211125090802000914_eum_ops.h5', 'uid': 'SVM04_npp_d20211125_t0853054_e0854296_b52224_c20211125090802000914_eum_ops.h5'},
             {'uri': '/data/pytroll/eum-sdr/SVM05_npp_d20211125_t0853054_e0854296_b52224_c20211125090802000931_eum_ops.h5', 'uid': 'SVM05_npp_d20211125_t0853054_e0854296_b52224_c20211125090802000931_eum_ops.h5'},
             {'uri': '/data/pytroll/eum-sdr/SVM06_npp_d20211125_t0853054_e0854296_b52224_c20211125090802000943_eum_ops.h5', 'uid': 'SVM06_npp_d20211125_t0853054_e0854296_b52224_c20211125090802000943_eum_ops.h5'},
             {'uri': '/data/pytroll/eum-sdr/SVM07_npp_d20211125_t0853054_e0854296_b52224_c20211125090802000953_eum_ops.h5', 'uid': 'SVM07_npp_d20211125_t0853054_e0854296_b52224_c20211125090802000953_eum_ops.h5'},
             {'uri': '/data/pytroll/eum-sdr/SVM08_npp_d20211125_t0853054_e0854296_b52224_c20211125090802000961_eum_ops.h5', 'uid': 'SVM08_npp_d20211125_t0853054_e0854296_b52224_c20211125090802000961_eum_ops.h5'},
             {'uri': '/data/pytroll/eum-sdr/SVM09_npp_d20211125_t0853054_e0854296_b52224_c20211125090802000970_eum_ops.h5', 'uid': 'SVM09_npp_d20211125_t0853054_e0854296_b52224_c20211125090802000970_eum_ops.h5'},
             {'uri': '/data/pytroll/eum-sdr/SVM10_npp_d20211125_t0853054_e0854296_b52224_c20211125090802000979_eum_ops.h5', 'uid': 'SVM10_npp_d20211125_t0853054_e0854296_b52224_c20211125090802000979_eum_ops.h5'},
             {'uri': '/data/pytroll/eum-sdr/SVM11_npp_d20211125_t0853054_e0854296_b52224_c20211125090802000987_eum_ops.h5', 'uid': 'SVM11_npp_d20211125_t0853054_e0854296_b52224_c20211125090802000987_eum_ops.h5'},
             {'uri': '/data/pytroll/eum-sdr/SVM12_npp_d20211125_t0853054_e0854296_b52224_c20211125090802000994_eum_ops.h5', 'uid': 'SVM12_npp_d20211125_t0853054_e0854296_b52224_c20211125090802000994_eum_ops.h5'},
             {'uri': '/data/pytroll/eum-sdr/SVM13_npp_d20211125_t0853054_e0854296_b52224_c20211125090803000002_eum_ops.h5', 'uid': 'SVM13_npp_d20211125_t0853054_e0854296_b52224_c20211125090803000002_eum_ops.h5'},
             {'uri': '/data/pytroll/eum-sdr/SVM14_npp_d20211125_t0853054_e0854296_b52224_c20211125090803000009_eum_ops.h5', 'uid': 'SVM14_npp_d20211125_t0853054_e0854296_b52224_c20211125090803000009_eum_ops.h5'},
             {'uri': '/data/pytroll/eum-sdr/SVM15_npp_d20211125_t0853054_e0854296_b52224_c20211125090803000016_eum_ops.h5', 'uid': 'SVM15_npp_d20211125_t0853054_e0854296_b52224_c20211125090803000016_eum_ops.h5'},
             {'uri': '/data/pytroll/eum-sdr/SVM16_npp_d20211125_t0853054_e0854296_b52224_c20211125090803000023_eum_ops.h5', 'uid': 'SVM16_npp_d20211125_t0853054_e0854296_b52224_c20211125090803000023_eum_ops.h5'}],
 'sensor': ['viirs']}

And the resulting posttroll message from the level 1c runner:

{"orig_platform_name": "npp", "start_time": "2021-11-25T08:53:05", "start_decimal": 4, "end_time": "1900-01-01T08:54:29", "end_decimal": 6, "orbit_number": 99999, "proctime": "20211125090537000265", "stream": "eumetcast", "variant": "EARS", "format": "PPS-L1C", "type": "NETCDF", "data_processing_level": "1c", "platform_name": "Suomi-NPP", "tle_platform_name": "SUOMI NPP", "antenna": "ears", "origin": "157.249.16.188:9227", "sensor": ["viirs"], "orig_orbit_number": 52224, "uri": "ssh://pps-v2021-a/data/pytroll/nwcsaf/import/IMAGER_data/S_NWC_viirs_npp_00000_20211125T0853054Z_20211125T0854296Z.nc", "uid": "S_NWC_viirs_npp_00000_20211125T0853054Z_20211125T0854296Z.nc"}

In the code, orbit_number is hardcoded to 0: https://github.com/foua-pps/level1c4pps/blob/master/level1c4pps/__init__.py#L383. That value is then used in the filename: https://github.com/foua-pps/level1c4pps/blob/master/level1c4pps/__init__.py#L528-L536

The orbit_number from the message is never used; that is, there is no way to pass it through to this module: https://github.com/foua-pps/level1c4pps/blob/master/level1c4pps/viirs2pps_lib.py#L127-L128

So my problem is that it is not possible to control the orbit number used in the NetCDF filename when calling this from e.g. the level 1c runner in pytroll-pps-runner.
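
To illustrate, a hedged sketch of what I would like to be able to do from the runner. process_one_scene is the existing entry point linked above, but the orbit_number keyword shown here is the requested behaviour and does not exist today:

from level1c4pps.viirs2pps_lib import process_one_scene

def run_viirs_l1c(msg, out_path):
    """Hypothetical runner-side call forwarding the message's orbit number."""
    scene_files = [item['uri'] for item in msg.data['dataset']]
    # orbit_number is the requested (not yet existing) keyword; currently the
    # value written into the filename is hardcoded to 0.
    process_one_scene(scene_files, out_path, orbit_number=msg.data['orbit_number'])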

Problem saving GAC data with satpy > 0.37

Code Sample, a minimal, complete, and verifiable piece of code

pytest level1c4pps

Problem description

Tests fail with satpy versions 0.38 and 0.39.

Expected Output

Tests OK

Actual Result, Traceback if applicable

File "/home/opt/CONDA/envs/ppsepssg/lib/python3.8/site-packages/satpy/writers/cf_writer.py", line 737, in _try_get_units_from_coords
if "units" in new_data.coords[c].attrs:
File "/home/opt/CONDA/envs/ppsepssg/lib/python3.8/site-packages/xarray/core/coordinates.py", line 333, in getitem
return self._data._getitem_coord(key)
File "/home/a001865/opt/CONDA/envs/ppsepssg/lib/python3.8/site-packages/xarray/core/dataarray.py", line 730, in _getitem_coord
_, key, var = _get_virtual_variable(
File "/home/opt/CONDA/envs/ppsepssg/lib/python3.8/site-packages/xarray/core/dataset.py", line 169, in _get_virtual_variable
ref_var = variables[ref_name]
KeyError: 'x'
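
The KeyError itself is ordinary xarray behaviour when a dimension has no coordinate variable; a minimal demonstration, independent of level1c4pps:

import numpy as np
import xarray as xr

da = xr.DataArray(np.zeros((2, 2)), dims=('y', 'x'))  # dims without coordinate variables
da.coords['x']  # raises KeyError: 'x', as in the traceback above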

Versions of Python, package at hand and relevant dependencies

Satpy 0.38 or 0.39. Version 0.37 is OK.


For AVHRR data, missing ch3a/b gives no image5 dataset

I'm not sure if this is a feature, an enhancement, or a bug.

AVHRR l1b only gives one of the ch3a/b channels, and this sometimes produces NetCDF files with a missing image5 dataset.

This is fine for most purposes, as far as I can understand.

But HRW processing of the individual AVHRR segments needs to concatenate the channels into one NetCDF file.

When some of the AVHRR NetCDF segments are missing image5 and some are not, the HRW preparation fails to concatenate them because the number of image datasets differs.

So again, maybe this needs to be fixed in the HRW preprocessing (see the sketch at the end of this issue).

When I generate AVHRR NetCDF files, I do it like this:

from level1c4pps.avhrr2pps_lib import process_one_scene as process_avhrr
process_avhrr(<l1b aapp file>, <out_directory>)

For a dataset with ch3a/b I get:

hrpt_metop01_20220204_0740_48682.l1b 
level1c4pps INFO: |08:06:58|: No valid operational coefficients, fall back to pre-launch
level1c4pps INFO: |08:06:58|: No valid operational coefficients, fall back to pre-launch
level1c4pps INFO: |08:06:58|: No valid operational coefficients, fall back to pre-launch
level1c4pps INFO: |08:06:59|: Saving datasets to NetCDF4/CF.
/software/miniconda/envs/ppsv2021/lib/python3.7/site-packages/satpy/writers/cf_writer.py:572: FutureWarning: The default behaviour of the CF writer will soon change to not compress data by default.
  FutureWarning)
/software/miniconda/envs/ppsv2021/lib/python3.7/site-packages/dask/core.py:119: RuntimeWarning: divide by zero encountered in true_divide
  return func(*(_execute_task(a, cache) for a in args))
Saved file S_NWC_avhrr_metopb_00000_20220204T0740000Z_20220204T0740599Z.nc after 3.5 seconds

Giving this NetCDF:

h5dump -H S_NWC_avhrr_metopb_48682_20220204T0740000Z_20220204T0740599Z.nc | grep DATASET
   DATASET "azimuthdiff" {
   DATASET "bnds_1d" {
   DATASET "image0" {
   DATASET "image1" {
   DATASET "image2" {
   DATASET "image3" {
   DATASET "image4" {
   DATASET "image5" {
   DATASET "lat" {
   DATASET "lon" {
   DATASET "satzenith" {
   DATASET "sunzenith" {
   DATASET "time" {
   DATASET "time_bnds" {
   DATASET "x" {
   DATASET "y" {

For a dataset without one of the ch3a/b channels:

hrpt_metop01_20220204_0757_48682.l1b
level1c4pps INFO: |08:05:42|: No valid operational coefficients, fall back to pre-launch
level1c4pps INFO: |08:05:42|: No valid operational coefficients, fall back to pre-launch
level1c4pps INFO: |08:05:42|: No valid operational coefficients, fall back to pre-launch
level1c4pps ERROR: |08:05:42|: Could not load dataset 'DataID(name='3b', wavelength=WavelengthRange(min=3.55, central=3.74, max=3.93, unit='µm'), resolution=1050, calibration=<calibration.brightness_temperature>, modifiers=())': "Could not load DataID(name='3b', wavelength=WavelengthRange(min=3.55, central=3.74, max=3.93, unit='µm'), resolution=1050, calibration=<calibration.brightness_temperature>, modifiers=()) from any provided files"
Traceback (most recent call last):
  File "/software/miniconda/envs/ppsv2021/lib/python3.7/site-packages/satpy/readers/yaml_reader.py", line 848, in _load_dataset_with_area
    ds = self._load_dataset_data(file_handlers, dsid, **kwargs)
  File "/software/miniconda/envs/ppsv2021/lib/python3.7/site-packages/satpy/readers/yaml_reader.py", line 720, in _load_dataset_data
    proj = self._load_dataset(dsid, ds_info, file_handlers, **kwargs)
  File "/software/miniconda/envs/ppsv2021/lib/python3.7/site-packages/satpy/readers/yaml_reader.py", line 706, in _load_dataset
    "Could not load {} from any provided files".format(dsid))
KeyError: "Could not load DataID(name='3b', wavelength=WavelengthRange(min=3.55, central=3.74, max=3.93, unit='µm'), resolution=1050, calibration=<calibration.brightness_temperature>, modifiers=()) from any provided files"
level1c4pps WARNING: |08:05:43|: The following datasets were not created and may require resampling to be generated: DataID(name='3b', wavelength=WavelengthRange(min=3.55, central=3.74, max=3.93, unit='µm'), resolution=1050, calibration=<calibration.brightness_temperature>, modifiers=())
level1c4pps INFO: |08:05:44|: Saving datasets to NetCDF4/CF.
/software/miniconda/envs/ppsv2021/lib/python3.7/site-packages/satpy/writers/cf_writer.py:572: FutureWarning: The default behaviour of the CF writer will soon change to not compress data by default.
  FutureWarning)
Saved file S_NWC_avhrr_metopb_00000_20220204T0757000Z_20220204T0757599Z.nc after 3.5 seconds

Giving this NetCDF:

h5dump -H S_NWC_avhrr_metopb_48682_20220204T0757000Z_20220204T0757599Z.nc | grep DATASET
   DATASET "azimuthdiff" {
   DATASET "bnds_1d" {
   DATASET "image0" {
   DATASET "image1" {
   DATASET "image2" {
   DATASET "image3" {
   DATASET "image4" {
   DATASET "lat" {
   DATASET "lon" {
   DATASET "satzenith" {
   DATASET "sunzenith" {
   DATASET "time" {
   DATASET "time_bnds" {
   DATASET "x" {
   DATASET "y" {

Can you please comment on this? Then I can decide how to solve it.
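
For reference, a minimal sketch of the HRW-side workaround I could imagine, assuming xarray and that a missing image5 may be padded with fill values before concatenation (all names illustrative, not existing code):

import numpy as np
import xarray as xr

def pad_missing_image5(segment_path):
    """Open one AVHRR level-1c segment and add an all-missing image5 if absent."""
    ds = xr.open_dataset(segment_path)
    if 'image5' not in ds:
        # Shape the dummy channel like an existing image dataset and fill it
        # with NaN so downstream processing can recognize it as missing.
        ds['image5'] = xr.full_like(ds['image4'], np.nan, dtype=np.float32)
    return ds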

Seviri2pps fails with h5netcdf backend

seviri2pps fails to write output with the h5netcdf backend. The netcdf4 backend works fine. I'm using the latest master of level1c4pps with satpy-0.24.0 and h5netcdf-0.8.1. Could also be a problem with Satpy's CF writer.

To reproduce:

$ seviri2pps.py /data/seviri_l1b_hrit/*20190409* -o /tmp

level1c4pps INFO: |09:19:23|: Saving datasets to NetCDF4/CF.
Traceback (most recent call last):
  File "/cmsaf/cmsaf-ops3/sfinkens/conda/envs/level1c4pps/bin/seviri2pps.py", line 7, in <module>
    exec(compile(f.read(), __file__, 'exec'))
  File "/cmsaf/nfshome/sfinkens/software/devel/level1c4pps/level1c4pps/bin/seviri2pps.py", line 58, in <module>
    engine=options.nc_engine)
  File "/cmsaf/nfshome/sfinkens/software/devel/level1c4pps/level1c4pps/level1c4pps/seviri2pps_lib.py", line 461, in process_one_scan
    exclude_attrs=['raw_metadata'])
  File "/cmsaf/cmsaf-ops3/sfinkens/conda/envs/level1c4pps/lib/python3.7/site-packages/satpy/scene.py", line 1373, in save_datasets
    return writer.save_datasets(datasets, compute=compute, **save_kwargs)
  File "/cmsaf/cmsaf-ops3/sfinkens/conda/envs/level1c4pps/lib/python3.7/site-packages/satpy/writers/cf_writer.py", line 717, in save_datasets
    **other_to_netcdf_kwargs)
  File "/cmsaf/cmsaf-ops3/sfinkens/conda/envs/level1c4pps/lib/python3.7/site-packages/xarray/core/dataset.py", line 1567, in to_netcdf
    invalid_netcdf=invalid_netcdf,
  File "/cmsaf/cmsaf-ops3/sfinkens/conda/envs/level1c4pps/lib/python3.7/site-packages/xarray/backends/api.py", line 1082, in to_netcdf
    dataset, store, writer, encoding=encoding, unlimited_dims=unlimited_dims
  File "/cmsaf/cmsaf-ops3/sfinkens/conda/envs/level1c4pps/lib/python3.7/site-packages/xarray/backends/api.py", line 1128, in dump_to_store
    store.store(variables, attrs, check_encoding, writer, unlimited_dims=unlimited_dims)
  File "/cmsaf/cmsaf-ops3/sfinkens/conda/envs/level1c4pps/lib/python3.7/site-packages/xarray/backends/common.py", line 255, in store
    variables, check_encoding_set, writer, unlimited_dims=unlimited_dims
  File "/cmsaf/cmsaf-ops3/sfinkens/conda/envs/level1c4pps/lib/python3.7/site-packages/xarray/backends/common.py", line 293, in set_variables
    name, v, check, unlimited_dims=unlimited_dims
  File "/cmsaf/cmsaf-ops3/sfinkens/conda/envs/level1c4pps/lib/python3.7/site-packages/xarray/backends/h5netcdf_.py", line 288, in prepare_variable
    **kwargs,
  File "/cmsaf/cmsaf-ops3/sfinkens/conda/envs/level1c4pps/lib/python3.7/site-packages/h5netcdf/core.py", line 501, in create_variable
    fillvalue, **kwargs)
  File "/cmsaf/cmsaf-ops3/sfinkens/conda/envs/level1c4pps/lib/python3.7/site-packages/h5netcdf/core.py", line 481, in _create_child_variable
    **kwargs)
  File "/cmsaf/cmsaf-ops3/sfinkens/conda/envs/level1c4pps/lib/python3.7/site-packages/h5py/_hl/group.py", line 148, in create_dataset
    dsid = dataset.make_new_dset(group, shape, dtype, data, name, **kwds)
  File "/cmsaf/cmsaf-ops3/sfinkens/conda/envs/level1c4pps/lib/python3.7/site-packages/h5py/_hl/dataset.py", line 110, in make_new_dset
    maxshape, scaleoffset, external, allow_unknown_filter)
  File "/cmsaf/cmsaf-ops3/sfinkens/conda/envs/level1c4pps/lib/python3.7/site-packages/h5py/_hl/filters.py", line 249, in fill_dcpl
    plist.set_chunk(chunks)
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "h5py/h5p.pyx", line 448, in h5py.h5p.PropDCID.set_chunk
ValueError: chunksize must be a tuple.
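
Until this is fixed, a workaround sketch is to stay on the netcdf4 engine, which works according to the report above (the traceback shows process_one_scan accepting an engine option; the other arguments here are illustrative):

import glob
from level1c4pps.seviri2pps_lib import process_one_scan

hrit_files = glob.glob('/data/seviri_l1b_hrit/*20190409*')
# engine='netcdf4' avoids the failing h5netcdf code path.
process_one_scan(hrit_files, '/tmp', engine='netcdf4')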

Unit test fails for pygac > 1.3.0

The unit test fails with versions of pygac > 1.3.0, as the cropped GAC file can no longer be read.

The problem will be fixed in the next version of pygac.

The small cropped GAC file was created by cropping the first part of a GAC file:
dd if=NSS.GHRR.TN.D80003.S1147.E1332.B0630506.GC of=NSS.GHRR.TN.D80003.S1147.E1332.B0630506.GC_C bs=10 count=6000

The error message:

../home/a001865/opt/CONDA_JUNE/envs/pps2018patch3_level1c4pps/lib/python3.6/importlib/_bootstrap.py:219: RuntimeWarning: numpy.ufunc size changed, may indicate binary incompatibility. Expected 192 from C header, got 216 from PyObject
  return f(*args, **kwds)
Could not load dataset 'DatasetID(name='latitude', wavelength=None, resolution=1050, polarization=None, calibration=None, level=None, modifiers=())': buffer is smaller than requested size
Traceback (most recent call last):
  File "/home/a001865/opt/CONDA_JUNE/envs/pps2018patch3_level1c4pps/lib/python3.6/site-packages/satpy/readers/yaml_reader.py", line 782, in _load_dataset_with_area
    ds = self._load_dataset_data(file_handlers, dsid, **kwargs)
  File "/home/a001865/opt/CONDA_JUNE/envs/pps2018patch3_level1c4pps/lib/python3.6/site-packages/satpy/readers/yaml_reader.py", line 667, in _load_dataset_data
    proj = self._load_dataset(dsid, ds_info, file_handlers, **kwargs)
  File "/home/a001865/opt/CONDA_JUNE/envs/pps2018patch3_level1c4pps/lib/python3.6/site-packages/satpy/readers/yaml_reader.py", line 643, in _load_dataset
    projectable = fh.get_dataset(dsid, ds_info)
  File "/home/a001865/opt/CONDA_JUNE/envs/pps2018patch3_level1c4pps/lib/python3.6/site-packages/satpy/readers/avhrr_l1b_gaclac.py", line 126, in get_dataset
    self.reader.read(self.filename)
  File "/home/a001865/opt/CONDA_JUNE/envs/pps2018patch3_level1c4pps/lib/python3.6/site-packages/pygac/pod_reader.py", line 257, in read
    count=self.head["number_of_scans"])
ValueError: buffer is smaller than requested size

Use satpy to rotate SEVIRI image

Satpy now offers the possibility to rotate GEO images as follows:

scn.load(['VIS008'], upper_right_corner='NE')

This could replace level1c4pps.seviri2pps_lib.rotate_band().
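
A slightly fuller sketch of how this could look in seviri2pps, assuming the seviri_l1b_hrit reader and a list of HRIT files (variable names illustrative):

import glob
from satpy import Scene

hrit_files = glob.glob('/data/seviri_l1b_hrit/*20190409*')
scn = Scene(filenames=hrit_files, reader='seviri_l1b_hrit')
# Let Satpy flip the data so the upper right corner points NE, replacing the
# manual rotate_band() step.
scn.load(['VIS008'], upper_right_corner='NE')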

Make time dimension unlimited in seviri2pps

This is just one little change in seviri2pps_lib.py:

scn_.save_datasets(..., unlimited_dims=['time'])

However, that requires the satpy CF writer to be updated (remove unlimited_dims from the kwargs when creating the file).
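
And a sketch of the CF-writer-side change described above: pop unlimited_dims out of the kwargs used when the file is first created, so it is only applied on the actual data writes (names follow the description, not the real satpy code):

def split_unlimited_dims(to_netcdf_kwargs):
    """Separate unlimited_dims from the kwargs used to create the file."""
    init_kwargs = dict(to_netcdf_kwargs)
    unlimited_dims = init_kwargs.pop('unlimited_dims', None)
    return init_kwargs, unlimited_dims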
