Home Page: https://r-spatial.github.io/stars/

License: Apache License 2.0


Spatiotemporal Arrays: Raster and Vector Datacubes


Spatiotemporal data often comes in the form of dense arrays, with space and time being array dimensions. Examples include

  • socio-economic or demographic data,
  • environmental variables monitored at fixed stations,
  • raster maps,
  • time series of satellite images with multiple spectral bands,
  • spatial simulations, and
  • climate or weather model output.

This R package provides classes and methods for reading, manipulating, plotting and writing such data cubes, to the extent that there are proper formats for doing so.
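
A minimal sketch of such a read-manipulate-write round trip, using the demo GeoTIFF that ships with the package (the output file name here is arbitrary):

library(stars)
tif = system.file("tif/L7_ETMs.tif", package = "stars")
x = read_stars(tif)              # a 3-dimensional (x, y, band) data cube
y = x[, 1:100, 1:100, ]          # manipulate: keep the first 100 columns and rows
write_stars(y, "L7_subset.tif")  # write the result back to disk through GDAL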

Raster and vector data cubes

The canonical data cube most of us have in mind is one where two dimensions represent spatial raster dimensions and the third represents time (or band), as e.g. shown here:

By data cubes, however, we also consider higher-dimensional cubes (hypercubes), such as a five-dimensional cube where, in addition to time, spectral band and sensor form dimensions:

or lower-dimensional cubes such as a raster image:

suppressPackageStartupMessages(library(dplyr))
library(stars)
# Loading required package: abind
# Loading required package: sf
# Linking to GEOS 3.7.2, GDAL 2.4.2, PROJ 5.2.0
tif = system.file("tif/L7_ETMs.tif", package = "stars")
read_stars(tif) %>%
  slice(index = 1, along = "band") %>%
  plot()

Raster data do not need to be regular and aligned with North/East; besides regular rasters, package stars also supports rotated, sheared, rectilinear and curvilinear rasters:

Vector data cubes arise when we do not have two regularly discretized spatial dimensions, but a single dimension indicating spatial feature geometries, such as polygons (e.g. denoting administrative regions):

or points (e.g. denoting sensor locations):
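
A minimal sketch of how such a cube can arise: aggregating a regular raster cube over a set of polygon geometries yields a vector data cube with a geometry dimension (the demo GeoTIFF from stars and an arbitrary 5 x 5 polygon grid are used here):

library(stars)
tif = system.file("tif/L7_ETMs.tif", package = "stars")
r = read_stars(tif)
polys = sf::st_make_grid(sf::st_as_sfc(sf::st_bbox(r)), n = c(5, 5))
a = aggregate(r, by = polys, FUN = mean)  # one value per polygon and band
a  # a stars object with dimensions: geometry, band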

NetCDF, GDAL

stars provides two functions to read data: read_ncdf and read_stars, where the latter reads through GDAL. (In the future, both will be integrated in read_stars.) For reading NetCDF files, package RNetCDF is used; for reading through GDAL, package sf provides the binary linking to GDAL.
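
A minimal sketch of both read paths, using example datasets that ship with the stars package:

library(stars)
x = read_stars(system.file("tif/L7_ETMs.tif", package = "stars"))  # through GDAL
y = read_ncdf(system.file("nc/reduced.nc", package = "stars"))     # through RNetCDF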

For vector and raster operations, stars relies as much as possible on the routines available in GDAL and PROJ (e.g. st_transform, rasterize, polygonize, warp). Read more about this in the vignette on vector-raster conversions, reprojection, warping.
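
For example, warping a raster to a different coordinate reference system and rasterizing polygon geometries might look as follows (a minimal sketch using the example datasets from stars and sf):

library(stars)
r = read_stars(system.file("tif/L7_ETMs.tif", package = "stars"))
r_ll = st_warp(r, crs = st_crs(4326))  # warp/reproject the grid to geographic coordinates
nc = sf::read_sf(system.file("gpkg/nc.gpkg", package = "sf"))
nc_r = st_rasterize(nc["BIR74"])       # burn a polygon attribute into a regular grid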

Out-of-memory (on-disk) rasters

Package stars provides stars_proxy objects (currently only when read through GDAL), which contain only the dimensions metadata and pointers to the files on disk. These objects work lazily: reading and processing data is postponed to the moment that pixels are really needed (at plot time, or when writing to disk), and is done at the lowest spatial resolution possible that still fulfills the resolution of the graphics device. More details are found in the stars proxy vignette.
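
A minimal sketch of this lazy behaviour, using the small demo GeoTIFF as a stand-in for a raster that would not fit in memory:

library(stars)
tif = system.file("tif/L7_ETMs.tif", package = "stars")
(p = read_stars(tif, proxy = TRUE))   # only dimension metadata is read here
p2 = st_apply(p, c("x", "y"), max)    # lazily recorded, nothing computed yet
plot(p2)  # pixels are read and reduced only now, at the plotting resolution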

The following methods are currently available for stars_proxy objects:

methods(class = "stars_proxy")
#  [1] [              adrop          aggregate      aperm         
#  [5] as.data.frame  c              coerce         dim           
#  [9] initialize     Math           merge          Ops           
# [13] plot           predict        print          show          
# [17] slotsFromS3    split          st_apply       st_as_stars   
# [21] st_crop        st_redimension write_stars   
# see '?methods' for accessing help and source code

Raster and vector time series analysis example

In the following, a curvilinear grid with hourly precipitation values of a hurricane is imported and the first 12 time steps are plotted:

prec_file = system.file("nc/test_stageiv_xyt.nc", package = "stars")
(prec = read_ncdf(prec_file, curvilinear = c("lon", "lat"), ignore_bounds = TRUE))
# no 'var' specified, using Total_precipitation_surface_1_Hour_Accumulation
# other available variables:
#  time_bounds, lon, lat, time
# No projection information found in nc file. 
#  Coordinate variable units found to be degrees, 
#  assuming WGS84 Lat/Lon.
# stars object with 3 dimensions and 1 attribute
# attribute(s):
#  Total_precipitation_surface_1_Hour_Accumulation [kg/m^2]
#  Min.   :  0.000                                         
#  1st Qu.:  0.000                                         
#  Median :  0.750                                         
#  Mean   :  4.143                                         
#  3rd Qu.:  4.630                                         
#  Max.   :163.750                                         
# dimension(s):
#      from  to                  offset   delta                       refsys
# x       1  87                      NA      NA +proj=longlat +datum=WGS8...
# y       1 118                      NA      NA +proj=longlat +datum=WGS8...
# time    1  23 2018-09-13 18:30:00 UTC 1 hours                      POSIXct
#      point                         values    
# x       NA [87x118] -80.6113,...,-74.8822 [x]
# y       NA   [87x118] 32.4413,...,37.6193 [y]
# time    NA                           NULL    
# curvilinear grid
sf::read_sf(system.file("gpkg/nc.gpkg", package = "sf"), "nc.gpkg") %>%
  st_transform(st_crs(prec)) -> nc # transform from NAD27 to WGS84
nc_outline = st_union(st_geometry(nc))
plot_hook = function() plot(nc_outline, border = 'red', add = TRUE)
prec %>%
  slice(index = 1:12, along = "time") %>%
  plot(downsample = c(5, 5, 1), hook = plot_hook)

and next, the cube is intersected with the counties of North Carolina, the maximum precipitation intensity per county is obtained, and the result is plotted:

a = aggregate(prec, by = nc, FUN = max)
# although coordinates are longitude/latitude, st_intersects assumes that they are planar
# although coordinates are longitude/latitude, st_intersects assumes that they are planar
plot(a, max.plot = 23, border = 'grey', lwd = .5)

We can integrate over (reduce) time, for instance to find out when the maximum precipitation occurred. The following code finds the time index, and then the corresponding time value:

index_max = function(x) ifelse(all(is.na(x)), NA, which.max(x))
st_apply(a, "geometry", index_max) %>%
  mutate(when = st_get_dimension_values(a, "time")[.$index_max]) %>%
  select(when) %>%
  plot(key.pos = 1, main = "time of maximum precipitation")

Other packages for data cubes

Package gdalcubes can be used to create data cubes (or functions from them) from image collections: sets of multi-band images with varying

  • spatial resolution
  • spatial extent
  • coordinate reference systems (e.g., spread over multiple UTM zones)
  • observation times

and does this by resampling and/or aggregating over space and/or time. It reuses GDAL VRTs and gdalwarp for spatial resampling and/or warping, and handles temporal resampling or aggregation itself.

Package ncdfgeom reads and writes vector data cubes from and to NetCDF files in a standards-compliant way.

Package raster is a powerful package for handling raster maps and stacks of raster maps, both in memory and on disk, but it does not address

  • non-raster time series,
  • multi-attribute raster time series,
  • rasters with mixed type attributes (e.g., numeric, logical, factor, POSIXct), or
  • rectilinear or curvilinear rasters.

A list of stars commands matching existing raster commands is found in this wiki. A list of translations in the opposite direction (from stars to raster) still needs to be made.

Other stars resources:

Acknowledgment

This project has been realized with financial support from the R Consortium.
