ropensci / smapr

An R package for acquisition and processing of NASA SMAP data

Home Page: https://docs.ropensci.org/smapr

Language: R (100.00%)
Topics: nasa, smap-data, raster, extract-data, acquisition, soil-moisture, soil-moisture-sensor, soil-mapping, peer-reviewed, r

smapr's Introduction

smapr

Project Status: Active – The project has reached a stable, usable state and is being actively developed.

An R package for acquisition and processing of NASA SMAP (Soil Moisture Active-Passive) data

Installation

To install smapr from CRAN:

install.packages("smapr")

To install the development version from GitHub:

# install.packages("devtools")
devtools::install_github("ropensci/smapr")

Docker instructions (alternative to a local installation)

If a local installation is not possible for some reason, we have made a Docker image available with smapr and all its dependencies.

docker run -d -p 8787:8787 earthlab/smapr

In a web browser, navigate to localhost:8787 and log in with username: rstudio, password: rstudio.
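
If the container exits immediately with status 0, you may need to supply a password as an environment variable when starting it (this follows the convention of RStudio-based Docker images; see also the Docker issue in the tracker below):

docker run -d -p 8787:8787 -e PASSWORD=rstudio earthlab/smapr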

Authentication

Access to the NASA SMAP data requires authentication through NASA’s Earthdata portal. If you do not already have a username and password through Earthdata, you can register for an account at https://urs.earthdata.nasa.gov/. You cannot use this package without an Earthdata account.

Once you have an account, you need to pass your Earthdata username (ed_un) and password (ed_pw) as environment variables that can be read from within your R session. There are a couple of ways to do this:

Recommended approach

Use set_smap_credentials('yourusername', 'yourpasswd'). This saves your credentials for future sessions (by default, in your .Renviron file), and will overwrite existing Earthdata credentials if overwrite = TRUE.
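
For example, a minimal call (the overwrite argument is part of the function's interface as described above; the credentials shown are placeholders):

set_smap_credentials("myusername", "mypassword", overwrite = TRUE)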

Alternative approaches

  • Use Sys.setenv() interactively in your R session to set your username and password (not including the < and >):
Sys.setenv(ed_un = "<your username>", ed_pw = "<your password>")
  • Create a text file called .Renviron in your home directory that contains your username and password. If you don’t know what your home directory is, execute normalizePath("~/") in the R console and it will be printed. Be sure to include a new line at the end of the file or R will fail silently when loading it.

Example .Renviron file (note the new line at the end!):

ed_un=slkdjfsldkjfs
ed_pw=dlfkjDD124^

Once this file is created, restart your R session and you should now be able to access these environment variables (e.g., via Sys.getenv("ed_un")).

SMAP data products

Multiple SMAP data products are provided by the NSIDC, and these products vary in the amount of processing. Currently, smapr primarily supports level 3 and level 4 data products, which represent global daily composite and global three hourly modeled data products, respectively. There are a wide variety of data layers available in SMAP products, including surface soil moisture, root zone soil moisture, freeze/thaw status, surface temperature, vegetation water content, vegetation opacity, net ecosystem carbon exchange, soil temperature, and evapotranspiration. NSIDC provides documentation for all SMAP data products on their website, and we provide a summary of data products supported by smapr below.

Dataset id   Description                                           Resolution
SPL2SMAP_S   SMAP/Sentinel-1 Radiometer/Radar Soil Moisture        3 km
SPL3FTA      Radar Northern Hemisphere Daily Freeze/Thaw State     3 km
SPL3SMA      Radar Global Daily Soil Moisture                      3 km
SPL3SMP      Radiometer Global Soil Moisture                       36 km
SPL3SMAP     Radar/Radiometer Global Soil Moisture                 9 km
SPL4SMAU     Surface/Rootzone Soil Moisture Analysis Update        9 km
SPL4SMGP     Surface/Rootzone Soil Moisture Geophysical Data       9 km
SPL4SMLM     Surface/Rootzone Soil Moisture Land Model Constants   9 km
SPL4CMDL     Carbon Net Ecosystem Exchange                         9 km

Typical workflow

At a high level, most workflows follow these steps:

  1. Find SMAP data with find_smap()
  2. Download data with download_smap()
  3. List data contents with list_smap()
  4. Extract data with extract_smap()

Each of these steps is outlined below:

Finding SMAP data

Data are hosted on a server by the National Snow and Ice Data Center. The find_smap() function searches for specific data products and returns a data frame of available data. As data mature and pass checks, versions advance. At any specific time, not all versions of all datasets for all dates may exist. For the most up-to-date overview of dataset versions, see the NSIDC SMAP data version webpage.

library(smapr)
library(terra)
#> terra 1.7.18
available_data <- find_smap(id = "SPL3SMAP", date = "2015-05-25", version = 3)
str(available_data)
#> 'data.frame':    1 obs. of  3 variables:
#>  $ name: chr "SMAP_L3_SM_AP_20150525_R13080_001"
#>  $ date: Date, format: "2015-05-25"
#>  $ dir : chr "SPL3SMAP.003/2015.05.25/"

Downloading and inspecting SMAP data

Given a data frame produced by find_smap, download_smap downloads the data onto the local file system. Unless a directory is specified as an argument, the data are stored in the user’s cache.

downloads <- download_smap(available_data)
#> Downloading https://n5eil01u.ecs.nsidc.org/SMAP/SPL3SMAP.003/2015.05.25/SMAP_L3_SM_AP_20150525_R13080_001.h5
#> Downloading https://n5eil01u.ecs.nsidc.org/SMAP/SPL3SMAP.003/2015.05.25/SMAP_L3_SM_AP_20150525_R13080_001.qa
#> Downloading https://n5eil01u.ecs.nsidc.org/SMAP/SPL3SMAP.003/2015.05.25/SMAP_L3_SM_AP_20150525_R13080_001.h5.iso.xml
str(downloads)
#> 'data.frame':    1 obs. of  4 variables:
#>  $ name     : chr "SMAP_L3_SM_AP_20150525_R13080_001"
#>  $ date     : Date, format: "2015-05-25"
#>  $ dir      : chr "SPL3SMAP.003/2015.05.25/"
#>  $ local_dir: chr "~/.cache/smap"
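
To store the files somewhere other than the cache, pass a directory (the path below is illustrative):

downloads <- download_smap(available_data, directory = "data/smap")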

The SMAP data are provided in HDF5 format, and in any one file there are actually multiple data sets, including metadata. The list_smap function allows users to inspect the contents of downloaded data at a high level (all = FALSE) or in depth (all = TRUE).

list_smap(downloads, all = FALSE)
#> $SMAP_L3_SM_AP_20150525_R13080_001
#>                           name group     otype dclass  dim
#> 1                     Metadata     . H5I_GROUP   <NA> <NA>
#> 2 Soil_Moisture_Retrieval_Data     . H5I_GROUP   <NA> <NA>

To see all of the data fields, set all = TRUE.
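
For example:

list_smap(downloads, all = TRUE)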

Extracting gridded data products

The extract_smap function extracts gridded data products (e.g., global soil moisture). If more than one file has been downloaded and passed into the first argument, extract_smap extracts data for each file.

sm_raster <- extract_smap(downloads, "Soil_Moisture_Retrieval_Data/soil_moisture")
plot(sm_raster, main = "Level 3 soil moisture: May 25, 2015")

The path “Soil_Moisture_Retrieval_Data/soil_moisture” was determined from the output of list_smap(downloads, all = TRUE), which lists all of the data contained in SMAP data files.
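
From here, standard terra operations apply. As a minimal post-processing sketch (assuming extract_smap returns a terra SpatRaster, as the terra usage in this README implies, and using an illustrative bounding box):

# Reproject to longitude/latitude, then crop to a region of interest
sm_longlat <- project(sm_raster, "EPSG:4326")
roi <- ext(-105, -95, 35, 45)  # xmin, xmax, ymin, ymax (illustrative values)
sm_crop <- crop(sm_longlat, roi)
plot(sm_crop, main = "Soil moisture subset")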

Saving GeoTIFF output

The data can be saved as a GeoTIFF using the writeRaster function from the terra package.

writeRaster(sm_raster, "sm_raster.tif")


smapr's People

Contributors

matt-oak, mbjoseph


smapr's Issues

Need a suggestion

I used your code for downloading the data, but while extracting the downloaded data, the file size is becoming excessive. Can you please give me a suggestion regarding the extraction of the downloaded data?
(A screenshot and a git.docx attachment were included with this issue.)
Please reply as soon as possible

Feature request: find SMAP by date range

It would be nice if find_smap could be used to find all data within a time interval. The argument could still be date, but when date is a vector of length two, that could specify the boundaries of the time interval to search over.
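
A sketch of how this could work as a user-side wrapper today (find_smap_interval is hypothetical, not part of smapr):

# Expand a length-two date vector into a daily sequence and search each day
find_smap_interval <- function(id, date, version) {
  stopifnot(length(date) == 2)
  days <- seq(as.Date(date[1]), as.Date(date[2]), by = "day")
  do.call(rbind, lapply(days, function(d) {
    find_smap(id = id, date = d, version = version)
  }))
}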

pre-build vignette?

👋 @mbjoseph!

On the rOpenSci docs server, the pkgdown website for your package can't be built: https://github.com/r-universe/ropensci/runs/4511976862?check_suite_focus=true

Reading 'vignettes/smapr-intro.Rmd'
-- RMarkdown error -------------------------------------------------------------
Quitting from lines 85-86 (smapr-intro.Rmd) 
Error in check_creds() : 
  smapr expected ed_un and ed_pw to be environment variables!
The smapr package requires a username and password from
NASA's Earthdata portal.

If you have a username and password please provide them with
the set_smap_credentials() function, e.g.,
set_smap_credentials('username', 'passwd')

If you do not have a username and password, get one here:
urs.earthdata.nasa.gov
--------------------------------------------------------------------------------
Error : Failed to render RMarkdown

You could precompute the vignette. Happy to answer any questions (but in 2022 😁)

subscript out of bounds error

Hi, I get the following error message.
Do you know what the possible reason might be?
Thank you!

> available_data <- find_smap(id = "SPL3SMP", date = "2017-01-01", version = 4)
Error in rvest::html_table(nodes)[[1]] : subscript out of bounds

Need unit tests with user-specified target directory

smapr could use some more high level unit tests to reproduce the error reported in #26. At least something that replicates the specific error reported in that issue, but ideally additional tests with multiple data products.

Error in curl

Error in curl::curl_fetch_memory(url, handle = handle) :
Timeout was reached: [urs.earthdata.nasa.gov] Connection timed out after 10000 milliseconds

401 unauthorized response from server - username and password correct

I think there may be an issue with authentication and set_smap_credentials - it's giving me a 401 error even though my username and password are correct and are correctly set in the environment variables.

https://n5eil01u.ecs.nsidc.org/ requires an Earthdata login, and I now believe they require a token in the Bearer header, as I was able to successfully download a SMAP .h5 file using GET(link, add_headers(Authorization = paste("Bearer", key, sep = " ")), write_disk(save_link))

thoughts?

Add verbose argument to download_smap()

@ldecicco-USGS suggests adding some message that indicates when files are downloading so users know that things are happening (ropensci/software-review#231). This will be particularly useful if people are downloading many large files over a slow connection.

The easiest way to do this would be to add a verbose argument to download_smap that optionally prints which files are being downloaded when.
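
A minimal sketch of the idea (illustrative only, not smapr's actual internals; the real logic also handles Earthdata authentication and the multiple file types per granule):

download_with_messages <- function(files, directory = tempdir(), verbose = TRUE) {
  base_url <- "https://n5eil01u.ecs.nsidc.org/SMAP/"
  for (i in seq_len(nrow(files))) {
    url <- paste0(base_url, files$dir[i], files$name[i], ".h5")
    if (verbose) message("Downloading ", url)  # report progress as files arrive
    download.file(url, file.path(directory, paste0(files$name[i], ".h5")), mode = "wb")
  }
  invisible(files)
}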

Not downloading all files

There is an issue within download_smap.R regarding downloading files. Take the SPL4SMGP.001/2015.03.31 directory on the FTP server as an example. Each time period (13000, 43000, 73000, etc.) has 3 files associated with it (.h5, .h5.iso.xml, .qa).

We would expect download_smap.R to download all of these files within the directory on the FTP server; however, it is only downloading one file per time period such as only the .h5 file for 13000, only the .h5.iso.xml file for 43000, only the .qa file for 73000, etc.

We'll need to fix this by changing functions within download_smap.R to handle downloading all 3 files associated with each time period to a local directory.
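
The core of the fix is to expand each granule basename into all three expected files before downloading, along these lines (the basename below is borrowed from another issue on this page and is purely illustrative):

extensions <- c(".h5", ".qa", ".h5.iso.xml")
paste0("SMAP_L4_SM_gph_20150331T013000_Vb1010_001", extensions)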

Is the Level 3 Passive Soil Moisture product (SPL3SMP) supported?

I am trying to extract the gridded object from a "SPL3SMP" file on any date, but I always get the following error when trying to extract_smap from it:

Error in .local(.Object, ...) : Dataset copy failed

This is the code I am trying to run:

library(smapr)
library(raster)

# Look for available data
available_data <- find_smap(id = "SPL3SMP", date = "2015-10-01", version = 4)
str(available_data)

# Download and inspect metadata
downloads <- download_smap(available_data, directory="data/SMAP/SPL3SMP.004/")
str(downloads)

# Take a look inside
list_smap(downloads, all = TRUE)

# Extract gridded data and visually inspect it
sm_raster <- extract_smap(downloads, name="Soil_Moisture_Retrieval_Data_AM/soil_moisture")
plot(sm_raster, main = "Level 3 soil moisture: October 1, 2015")

Am I missing something? Perhaps this specific product has gone through a recent change that broke the code?

Function to set credentials

@marcosci pointed out a convenient way for users to provide credentials through a function like

set_credentials <- function(ed_un, ed_pw) {
  # Append a Sys.setenv() call to the user's .Rprofile so the credentials
  # are set automatically at the start of every future R session
  write(paste0("Sys.setenv(ed_un = \"", ed_un, "\", ed_pw = \"", ed_pw, "\")"),
        file = file.path(Sys.getenv("HOME"), ".Rprofile"), append = TRUE)
}

Want to add, but will need to think about a good name. I'd like to get smap into the name somewhere, to avoid having too general a name.

Tests are too slow

The tests as currently written download many large data files. Where possible, it would be wise to instead download small data files. The SPL3SMP data is on a coarser (36 km) grid, and is only ~10 MB. This would be preferable to some of the 9 and 3 km grids, which can be upwards of 500 MB.

It would be awesome if where possible the SPL3SMP data could be used in the tests of download_smap, list_smap, and extract_smap (it doesn't matter for find_smap).
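
A sketch of such a test (testthat assumed; the date/version combination is borrowed from another issue on this page and may need updating, and valid Earthdata credentials are required):

library(testthat)
test_that("the small SPL3SMP product can be found and downloaded", {
  files <- find_smap(id = "SPL3SMP", date = "2019-08-01", version = 5)
  expect_s3_class(files, "data.frame")
  downloads <- download_smap(files)
  expect_s3_class(downloads, "data.frame")
})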

README docker instructions

The README's docker instructions are incomplete.

Running the supplied docker command exits without failure (exit status 0), potentially leaving the user confused as to what happened.

Attempting to run the command in the foreground (remove -d flag: docker run -p 8787:8787 earthlab/smapr) reveals that I was not supplying a required environment variable, PASSWORD.
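
For reference, a working invocation looks like this (the password value is a placeholder):

docker run -d -p 8787:8787 -e PASSWORD=yourpassword earthlab/smapr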

May want to link directly to the documentation for the earthlab/smapr Docker image (https://github.com/earthlab/dockerfiles/tree/master/smapr#how-to-use) in the README.

Error in Curl when Fetching Data

Here is the code I am using to download one observation of SMAP for every day within a given window:

library(smapr)
library(raster)  # provides projectRaster(), mask(), crop(), writeRaster()

date.code <- seq(as.Date("2015/07/26"), as.Date("2015/12/31"), by = 1)

for (i in 1:length(date.code)) {
  newName <- paste0('SMAP_', date.code[i], '_time12.tif')  # Create unique filename
  SMAP_data <- find_smap(id = "SPL4SMAU", date = date.code[i], version = 4)
  SMAP_noon <- SMAP_data[4, ]  # Index SMAP data to get 4th observation, at time = 12
  SMAP_download <- download_smap(SMAP_noon)  # Download SMAP t = 12 observation
  SMAP_raster <- extract_smap(SMAP_download, "/Analysis_Data/sm_rootzone_analysis")
  SMAP_reproj <- projectRaster(SMAP_raster, crs = "+init=epsg:4326 +proj=longlat +datum=WGS84 +no_defs +ellps=WGS84 +towgs84=0,0,0")  # Reproject SMAP raster into WGS84
  SMAP_mask <- mask(SMAP_reproj, roi_mask)  # Mask to region of interest (roi_mask defined elsewhere)
  SMAP_crop <- crop(SMAP_mask, roi_mask)  # Crop
  rf <- writeRaster(x = SMAP_crop, filename = newName, format = "GTiff", overwrite = TRUE)  # Store
}

It has worked many times over the past month; however, occasionally I run into the same error, which seems to occur after I have downloaded a lot of imagery for one day.

The error is this:
Error in curl::curl_fetch_disk(url, x$path, handle = handle) :
Failed writing body (2252 != 16060)

The number preceding the != operator changes between error results, however.

Please let me know if you know why this is occurring! We are attempting to use the smapr package for a project for NASA DEVELOP and will be unable to use it if it cannot reliably retrieve SMAP imagery.

Thank you!
Max Dunsker
[email protected]

Error in validate_version(folder_names, id, version)

I was trying to use smapr but it is returning an error. Here is a minimal reproducible example:

library(smapr)
library(sp)
library(raster)

start_date <- as.Date("2015-03-31")
end_date <- as.Date("2015-04-02")
date_sequence <- seq(start_date, end_date, by = 1)

available_data <- find_smap(id = 'SPL4SMAU', 
                            dates = date_sequence, 
                            version = 4)

Error in validate_version(folder_names, id, version) :
Invalid data version. SPL4SMAU.004 does not exist at https://n5eil01u.ecs.nsidc.org/SMAP/

sessionInfo()
R version 4.1.1 (2021-08-10)
Platform: x86_64-w64-mingw32/x64 (64-bit)
Running under: Windows 10 x64 (build 19042)

Matrix products: default

locale:
[1] LC_COLLATE=English_India.1252 LC_CTYPE=English_India.1252
[3] LC_MONETARY=English_India.1252 LC_NUMERIC=C
[5] LC_TIME=English_India.1252

attached base packages:
[1] stats graphics grDevices utils datasets methods base

other attached packages:
[1] raster_3.4-13 sp_1.4-5 smapr_0.2.1

loaded via a namespace (and not attached):
[1] Rcpp_1.0.7 pillar_1.6.2 compiler_4.1.1 BiocManager_1.30.16
[5] rhdf5filters_1.4.0 prettyunits_1.1.1 remotes_2.4.0 tools_4.1.1
[9] testthat_3.0.4 pkgbuild_1.2.0 pkgload_1.2.1 tibble_3.1.3
[13] lubridate_1.7.10 memoise_2.0.0 lifecycle_1.0.0 rhdf5_2.36.0
[17] lattice_0.20-44 pkgconfig_2.0.3 rlang_0.4.11 cli_3.0.1
[21] curl_4.3.2 fastmap_1.1.0 stringr_1.4.0 xml2_1.3.2
[25] withr_2.4.2 httr_1.4.2 vctrs_0.3.8 generics_0.1.0
[29] desc_1.3.0 fs_1.5.0 devtools_2.4.2 rappdirs_0.3.3
[33] rprojroot_2.0.2 grid_4.1.1 glue_1.4.2 R6_2.5.1
[37] processx_3.5.2 fansi_0.5.0 sessioninfo_1.1.1 selectr_0.4-2
[41] callr_3.7.0 purrr_0.3.4 Rhdf5lib_1.14.2 magrittr_2.0.1
[45] ps_1.6.0 codetools_0.2-18 ellipsis_0.3.2 usethis_2.0.1
[49] rvest_1.0.1 utf8_1.2.2 stringi_1.7.3 cachem_1.0.5
[53] crayon_1.4.1

Add support for SMAP/Sentinel-1 data products?

The SMAP/Sentinel-1 L2 Radiometer/Radar 30-Second Scene 3 km EASE-Grid Soil Moisture data set is now available, but smapr has not been tested with this data product. There have been requests to add support for this higher resolution product.

  • Test find_smap
  • Test download_smap
  • Test list_smap
  • Test extract_smap

Unable to download SMAP_L3_SM_P_20190801_R16022_001

I am writing the code below and getting the following error; please help me.
Regards,
Abhishek

library(smapr)
library(raster)
find_smap(id = "SPL3SMP", date = "2019-08-01", version = 5)
name date dir
1 SMAP_L3_SM_P_20190801_R16022_001 2019-08-01 SPL3SMP.005/2019.08.01/
download_smap()
Error in validate_input_df(files) :
argument "files" is missing, with no default
download_smap("SMAP_L3_SM_P_20190801_R16022_001 2019-08-01 SPL3SMP.005/2019.08.01/")
Error in vector(mode = "list", length = n_downloads) :
invalid 'length' argument
download_smap("SMAP_L3_SM_P_20190801_R16022_001")
Error in vector(mode = "list", length = n_downloads) :
invalid 'length' argument
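
Per the README workflow, download_smap expects the data frame returned by find_smap rather than a character string, i.e.:

available <- find_smap(id = "SPL3SMP", date = "2019-08-01", version = 5)
downloads <- download_smap(available)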

Error extracting NEE data

I was following instructions (https://cran.r-project.org/web/packages/smapr/vignettes/smapr-intro.html) to extract NEE data.
However, I encountered an error when extracting after downloading the data.

I got the following error message:

> sm_raster <- extract_smap(local_files, '/NEE/nee_mean/')
Error in matrix == fill_value : non-conformable arrays
In addition: Warning message:
In H5Aread(A) : Reading attribute data of type 'VLEN' not yet implemented. Values replaced by NA's.

Better layer names in RasterStack output

Currently, extract_smap returns a RasterStack object with uninformative layer names like tmp.1 and tmp.2:

files <- find_smap(id = "SPL3FTA", date = "2015.04.14", version = 3)
downloads <- download_smap(files)
r <- extract_smap(downloads, name = "/Freeze_Thaw_Retrieval_Data/freeze_thaw")
names(r)

It would be much better if these names were meaningful. For instance, for L4 products each layer corresponds to a specific time. The name could have something to do with that (even the full filenames minus extensions would be sufficient, e.g., SMAP_L4_SM_gph_20150331T013000_Vb1010_001).

For the L3 freeze/thaw data, each layer corresponds to a filename plus a specific part of the day (there is an AM dimension and a PM dimension).
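
In the meantime, a user-side workaround for the freeze/thaw example above might look like this (it assumes exactly two layers, ordered AM then PM, which is an assumption):

names(r) <- paste0(downloads$name, c("_AM", "_PM"))
names(r)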

SPL2SMP_E: westBoundLongitude error in extract_smap()

Hi,

I am running into a problem trying to extract a dataset from some SMAP data, specifically, I am looking at this dataset: https://nsidc.org/data/SPL2SMP_E/versions/2

I am interested in only extracting out the soil moisture data and writing it out as a geotiff. I was able to download the files just fine using download_smap(). All of the files I need are stored in a data frame with the name, date, directory, and local directory that I created using find_smap().

However, when I go to extract the dataset using extract_smap(), it keeps throwing this error at me:

Code
# specify dates
date_begin <-paste("2018","-06-01",sep = "")
date_end <-paste("2018","-06-27",sep = "")
dates <- seq(as.Date(date_begin), as.Date(date_end), "days")

# remove one of the dates b/c it's missing
dates <- dates[-3]

# identify SMAP dataset
files <- find_smap(id = "SPL2SMP_E", dates = dates, version = 2)

# specify local directory
inputfolder = "E:/tmp"
setwd(inputfolder)

# add local directory to the files data frame as an extra column
files$local_dir <- inputfolder

# extract soil moisture
sm_raster <- extract_smap(data = files, name = "Soil_Moisture_Retrieval_Data/soil_moisture")

Error given to me by extract_smap()
Error in eval(substitute(expr), data, enclos = parent.frame()) :
object 'westBoundLongitude' not found

Any thoughts? Is it a dataset specific error?

Session Info
R version 3.4.1 (2017-06-30)
Platform: x86_64-w64-mingw32/x64 (64-bit)
Running under: Windows >= 8 x64 (build 9200)

Matrix products: default

locale:
[1] LC_COLLATE=English_United States.1252 LC_CTYPE=English_United States.1252
[3] LC_MONETARY=English_United States.1252 LC_NUMERIC=C
[5] LC_TIME=English_United States.1252

attached base packages:
[1] stats graphics grDevices utils datasets methods base

other attached packages:
[1] stringr_1.2.0 rhdf5_2.20.0 smapr_0.1.2 raster_2.5-8 sp_1.2-4

loaded via a namespace (and not attached):
[1] Rcpp_0.12.11 lattice_0.20-35 XML_3.98-1.9 rappdirs_0.3.1
[5] grid_3.4.1 R6_2.2.2 magrittr_1.5 httr_1.2.1
[9] stringi_1.1.5 zlibbioc_1.22.0 curl_3.1 xml2_1.1.1
[13] tools_3.4.1 selectr_0.3-1 compiler_3.4.1 rvest_0.3.2

Continue to check data availability when there are gaps in the dates

Thanks for this awesome package! It makes it really convenient to download and pre-process SMAP data! However, I have a suggestion to make.

I am trying to download the entire data set of the "SPL2SMP_E" product:

# Set dates
seq_dates <- seq(as.Date("2015-03-31"), as.Date("2017-08-31"), by = 1)

# Set products
av_data_l2   <- find_smap(id = "SPL2SMP_E", dates = seq_dates, version = 1)

However, I get the following error message:

Error in validate_date(id, version, date) : 
  Data are not available for this date. 2015.05.13 does not exist at https://n5eil01u.ecs.nsidc.org/SMAP/SPL2SMP.004/

Which means: if at least one day of data is unavailable on the server, the function will not proceed with the data availability check. This forces the user to create several date ranges in order to avoid gaps in the dates, which is not very convenient.

A possible solution would be to ignore the error, proceed with the availability check, inform the user at the end which dates were not available on the server, and allow the download of the remaining data anyway.
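
As a sketch, the behavior could look like this user-side wrapper (find_smap_tolerant is hypothetical, not part of smapr):

find_smap_tolerant <- function(id, dates, version) {
  results <- lapply(dates, function(d) {
    # skip dates that error out instead of aborting the whole search
    tryCatch(find_smap(id = id, dates = d, version = version),
             error = function(e) NULL)
  })
  missing <- dates[vapply(results, is.null, logical(1))]
  if (length(missing) > 0) {
    message("No data available for: ", paste(missing, collapse = ", "))
  }
  do.call(rbind, results)
}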

Would that be feasible?

Thanks!

Vignette expansion

Both @marcosci and @ldecicco-USGS came up with some great suggestions to improve the vignette (ropensci/software-review#231):

  • Add some more detailed post-processing examples to illustrate things like cropping, masking, or filtering the soil moisture data. Maybe include some more interesting visualizations.

  • Give some indication of how long download_smap takes. This will depend on the data type, the number of files, and the user's internet connection, but it's worth at least mentioning that this can be either fast or slow.

  • List the supported SMAP data types in the vignette (same table from README).

Please remove dependencies on **rgdal**, **rgeos**, and/or **maptools**

This package depends on (depends, imports or suggests) raster and one or more of the retiring packages rgdal, rgeos or maptools (https://r-spatial.org/r/2022/04/12/evolution.html, https://r-spatial.org/r/2022/12/14/evolution2.html). Since raster 3.6.3, all use of external FOSS library functionality has been transferred to terra, making the retiring packages very likely redundant. It would help greatly if you could remove dependencies on the retiring packages as soon as possible.

SEC certification issues preventing data download using smapr

Since very recently smapr::find_smap has been throwing an error related to the expiration of the SEC certification:

Error in curl::curl_fetch_memory(url, handle = handle) :
schannel: next InitializeSecurityContext failed: SEC_E_CERT_EXPIRED (0x80090328) - The received certificate has expired.

This prevents automated data retrieval using smapr. Is there a way we can circumvent this issue?

Add support for level 3 freeze thaw data

The data product SPL3FTA is currently not supported. Adding support necessitates working with a different grid (this product is restricted to the northern hemisphere upper latitudes).
