
shapefile's Introduction

Streaming Shapefile Parser

In Node:

var shapefile = require("shapefile");

shapefile.open("example.shp")
  .then(source => source.read()
    .then(function log(result) {
      if (result.done) return;
      console.log(result.value);
      return source.read().then(log);
    }))
  .catch(error => console.error(error.stack));

In a browser:

<!DOCTYPE html>
<script src="https://unpkg.com/shapefile@0.6"></script>
<script>

shapefile.open("https://cdn.rawgit.com/mbostock/shapefile/master/test/points.shp")
  .then(source => source.read()
    .then(function log(result) {
      if (result.done) return;
      console.log(result.value);
      return source.read().then(log);
    }))
  .catch(error => console.error(error.stack));

</script>

In a terminal:

shp2json example.shp

For a live example, see bl.ocks.org/2dd741099154a4da55a7db31fd96a892. See also ndjson-cli for examples of manipulating GeoJSON using newline-delimited JSON streams. See Command-Line Cartography for a longer introduction.

This parser implementation is based on the ESRI Shapefile Technical Description, dBASE Table for ESRI Shapefile (DBF) and Data File Header Structure for the dBASE Version 7 Table File. Caveat emptor: this is a work in progress and does not currently support all shapefile geometry types. It only supports dBASE III and has little error checking. Please contribute if you want to help!

In-browser parsing of dBASE table files requires TextDecoder, part of the Encoding living standard, which is not supported in IE or Safari as of September 2016. See text-encoding for a browser polyfill.

TypeScript definitions are available in DefinitelyTyped: typings install dt~shapefile.

API Reference

# shapefile.read(shp[, dbf[, options]]) <>

Returns a promise that yields a GeoJSON feature collection for specified shapefile shp and dBASE table file dbf. The meaning of the arguments is the same as shapefile.open. This is a convenience API for reading an entire shapefile in one go; use this method if you don’t mind putting the whole shapefile in memory. The yielded collection has a bbox property representing the bounding box of all records in this shapefile. The bounding box is specified as [xmin, ymin, xmax, ymax], where x and y represent longitude and latitude in spherical coordinates.
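
The bounding-box convention can be illustrated with a small helper; computeBbox below is a hypothetical function (not part of this library's API) that derives [xmin, ymin, xmax, ymax] from a list of positions:

```javascript
// Hypothetical helper (not part of the shapefile API) showing how a
// [xmin, ymin, xmax, ymax] bounding box is derived from [x, y] positions.
function computeBbox(positions) {
  let xmin = Infinity, ymin = Infinity, xmax = -Infinity, ymax = -Infinity;
  for (const [x, y] of positions) {
    if (x < xmin) xmin = x;
    if (y < ymin) ymin = y;
    if (x > xmax) xmax = x;
    if (y > ymax) ymax = y;
  }
  return [xmin, ymin, xmax, ymax];
}
```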

The coordinate reference system of the feature collection is not specified. This library does not support parsing coordinate reference system specifications (.prj); see Proj4js for parsing well-known text (WKT) specifications.

# shapefile.open(shp[, dbf[, options]]) <>

Returns a promise that yields a GeoJSON Feature source.

If typeof shp is “string”, opens the shapefile at the specified shp path. If shp does not have a “.shp” extension, it is implicitly added. If shp instanceof ArrayBuffer or shp instanceof Uint8Array, reads the specified in-memory shapefile. Otherwise, shp must be a Node readable stream in Node or a WhatWG standard readable stream in browsers.

If typeof dbf is “string”, opens the dBASE file at the specified dbf path. If dbf does not have a “.dbf” extension, it is implicitly added. If dbf instanceof ArrayBuffer or dbf instanceof Uint8Array, reads the specified in-memory dBASE file. If dbf is undefined and shp is a string, then dbf defaults to shp with the “.shp” extension replaced with “.dbf”; in this case, no error is thrown if there is no dBASE file at the resulting implied dbf. If dbf is undefined and shp is not a string, or if dbf is null, then no dBASE file is read, and the resulting GeoJSON features will have empty properties. Otherwise, dbf must be a Node readable stream in Node or a WhatWG standard readable stream in browsers.
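
The defaulting behavior described above can be sketched as a small function. This mirrors the prose, not the library's internal code; impliedDbfPath is a hypothetical name:

```javascript
// Sketch of the documented defaulting: when dbf is undefined and shp is a
// string, the dBASE path is implied by swapping the extension; dbf === null
// means "no dBASE file". Not the library's internal implementation.
function impliedDbfPath(shp, dbf) {
  if (dbf === null) return null;              // explicitly no dBASE file
  if (dbf !== undefined) return dbf;          // caller-specified path
  if (typeof shp !== "string") return null;   // no path to derive from
  return shp.replace(/\.shp$/, "") + ".dbf";  // example.shp -> example.dbf
}
```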

If typeof shp or dbf is “string”, in Node, the files are read from the file system; in browsers, the files are read using streaming fetch, if available, and falling back to XMLHttpRequest. See path-source for more.

The following options are supported:

  • encoding - the dBASE character encoding; defaults to “windows-1252”
  • highWaterMark - in Node, the size of the stream’s internal buffer; defaults to 65536

# shapefile.openShp(shp[, options]) <>

Returns a promise that yields a GeoJSON geometry source. Unlike shapefile.open, this only reads the shapefile, and never the associated dBASE file. Subsequent calls to source.read will yield GeoJSON geometries.

If typeof shp is “string”, opens the shapefile at the specified shp path. If shp does not have a “.shp” extension, it is implicitly added. In Node, the files are read from the file system; in browsers, the files are read using streaming fetch, if available, and falling back to XMLHttpRequest. (See path-source for more.) If shp instanceof ArrayBuffer or shp instanceof Uint8Array, reads the specified in-memory shapefile. Otherwise, shp must be a Node readable stream in Node or a WhatWG standard readable stream in browsers.

The following options are supported:

  • highWaterMark - in Node, the size of the stream’s internal buffer; defaults to 65536

# shapefile.openDbf(dbf[, options]) <>

Returns a promise that yields a GeoJSON properties object source. Unlike shapefile.open, this only reads the dBASE file, and never the associated shapefile. Subsequent calls to source.read will yield GeoJSON properties objects.

If typeof dbf is “string”, opens the dBASE file at the specified dbf path. If dbf does not have a “.dbf” extension, it is implicitly added. In Node, the files are read from the file system; in browsers, the files are read using streaming fetch, if available, and falling back to XMLHttpRequest. (See path-source for more.) If dbf instanceof ArrayBuffer or dbf instanceof Uint8Array, reads the specified in-memory dBASE file. Otherwise, dbf must be a Node readable stream in Node or a WhatWG standard readable stream in browsers.

The following options are supported:

  • encoding - the dBASE character encoding; defaults to “windows-1252”
  • highWaterMark - in Node, the size of the stream’s internal buffer; defaults to 65536

Sources

Calling shapefile.open yields a source; you can then call source.read to read individual GeoJSON features. Similarly, shapefile.openShp yields a source of GeoJSON geometries, and shapefile.openDbf yields a source of GeoJSON properties objects.

# source.bbox

The shapefile’s bounding box [xmin, ymin, xmax, ymax], where x and y represent longitude and latitude in spherical coordinates. This field is only defined on sources returned by shapefile.open and shapefile.openShp, not shapefile.openDbf.

# source.read() <>

Returns a Promise for the next record from the underlying stream. The yielded result is an object with the following properties:

  • value - a JSON object, or undefined if the stream ended
  • done - a boolean which is true if the stream ended

The type of JSON object depends on the type of source: it may be either a GeoJSON feature, a GeoJSON geometry, or a GeoJSON properties object (any JSON object).
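
The {value, done} protocol above can be drained with async/await instead of the recursive promise chain shown earlier. In this sketch, mockSource stands in for what shapefile.open resolves to; a real source would come from the library:

```javascript
// mockSource imitates a source's read() protocol for illustration; a real
// source comes from shapefile.open / openShp / openDbf.
function mockSource(records) {
  let i = 0;
  return {
    read: () => Promise.resolve(
      i < records.length
        ? {done: false, value: records[i++]}
        : {done: true, value: undefined}
    )
  };
}

// Drain a source into an array using async/await.
async function readAll(source) {
  const values = [];
  let result;
  while (!(result = await source.read()).done) values.push(result.value);
  return values;
}
```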

# source.cancel() <>

Returns a Promise which is resolved when the underlying stream has been destroyed.

Command Line Reference

shp2json

# shp2json [options…] [file] <>

Converts the specified shapefile file to GeoJSON. If file is not specified, defaults to reading from stdin (with no dBASE file). For example, to convert to a feature collection:

shp2json example.shp

To convert to a geometry collection:

shp2json -g example.shp

To convert to newline-delimited features:

shp2json -n example.shp

To convert to newline-delimited geometries:

shp2json -ng example.shp
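
Conceptually, the -n flag serializes one feature per line instead of wrapping everything in a collection. A minimal sketch of that transformation (not shp2json's actual implementation):

```javascript
// Turn a GeoJSON feature collection into newline-delimited JSON: one
// serialized feature per line. Sketch only, not shp2json's code.
function toNewlineDelimited(collection) {
  return collection.features.map(feature => JSON.stringify(feature)).join("\n");
}
```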

When --geometry or --ignore-properties is not used, the shapefile is joined to the dBASE table file (.dbf) corresponding to the specified shapefile file, if any.

# shp2json -h
# shp2json --help

Output usage information.

# shp2json -V
# shp2json --version

Output the version number.

# shp2json -o file
# shp2json --out file

Specify the output file name. Defaults to “-” for stdout.

# shp2json -n
# shp2json --newline-delimited

Output newline-delimited JSON, with one feature or geometry per line.

# shp2json -g
# shp2json --geometry

Output a geometry collection instead of a feature collection or, in conjunction with --newline-delimited, geometries instead of feature objects. Implies --ignore-properties.

# shp2json --ignore-properties

Ignore the corresponding dBASE table file (.dbf), if any. Output features will have an empty properties object.

# shp2json --encoding encoding

Specify the dBASE table file character encoding. Defaults to “windows-1252”.

# shp2json --crs-name name

Specify the coordinate reference system name. This only applies when generating a feature collection; it is ignored when -n or -g is used. Per the GeoJSON specification, the name should be an OGC CRS URN such as urn:ogc:def:crs:OGC:1.3:CRS84. However, legacy identifiers such as EPSG:4326 may also be used.

This does not convert between coordinate reference systems! It merely outputs coordinate reference system metadata. This library does not support parsing coordinate reference system specifications (.prj).
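
The crs member written by --crs-name is the legacy named-CRS object from the 2008 GeoJSON specification. A hedged sketch (not shp2json's code) of what gets attached to the collection:

```javascript
// Sketch of the metadata --crs-name attaches: a legacy (GeoJSON 2008)
// named-CRS member on the feature collection. Coordinates are untouched;
// no reprojection happens.
function withCrsName(collection, name) {
  return Object.assign({}, collection, {
    crs: {type: "name", properties: {name: name}}
  });
}
```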

dbf2json

# dbf2json [options…] [file] <>

Converts the specified dBASE file to JSON. If file is not specified, defaults to reading from stdin. For example:

dbf2json example.dbf

To convert to newline-delimited objects:

dbf2json -n example.dbf

# dbf2json -h
# dbf2json --help

Output usage information.

# dbf2json -V
# dbf2json --version

Output the version number.

# dbf2json -o file
# dbf2json --out file

Specify the output file name. Defaults to “-” for stdout.

# dbf2json -n
# dbf2json --newline-delimited

Output newline-delimited JSON, with one object per line.

# dbf2json --encoding encoding

Specify the input character encoding. Defaults to “windows-1252”.

shapefile's People

Contributors

deniscarriere, frogcat, jasondavies, mbostock, mhall2, mhkeller


shapefile's Issues

Encoding issues

When trying to read a UTF-8 encoded .dbf, non-ASCII characters unfortunately get mangled. I used the usual suspects at https://github.com/interactivethings/swiss-maps as input.

>>> dbfcat shp/swiss-cantons/swiss-cantons.dbf
header { version: 3,
  date: Wed Jul 26 1995 00:00:00 GMT+0200 (CEST),
  count: 26,
  fields: 
   [ { name: 'NR', type: 'N', length: 5 },
     { name: 'ABKUERZUNG', type: 'C', length: 2 },
     { name: 'NAME', type: 'C', length: 80 },
     { name: 'COUNTRY', type: 'C', length: 20 } ] }
record [ 1, 'ZH', 'Z�rich', 'CH' ]
record [ 2, 'BE', 'Bern/Berne', 'CH' ]
record [ 3, 'LU', 'Luzern', 'CH' ]
record [ 4, 'UR', 'Uri', 'CH' ]
record [ 5, 'SZ', 'Schwyz', 'CH' ]
record [ 6, 'OW', 'Obwalden', 'CH' ]
record [ 7, 'NW', 'Nidwalden', 'CH' ]
record [ 8, 'GL', 'Glarus', 'CH' ]
record [ 9, 'ZG', 'Zug', 'CH' ]
record [ 10, 'FR', 'Fribourg', 'CH' ]
record [ 11, 'SO', 'Solothurn', 'CH' ]
record [ 12, 'BS', 'Basel-Stadt', 'CH' ]
record [ 13, 'BL', 'Basel-Landschaft', 'CH' ]
record [ 14, 'SH', 'Schaffhausen', 'CH' ]
record [ 15, 'AR', 'Appenzell Ausserrhoden', 'CH' ]
record [ 16, 'AI', 'Appenzell Innerrhoden', 'CH' ]
record [ 17, 'SG', 'St. Gallen', 'CH' ]
record [ 18, 'GR', 'Graub�nden/Grigioni', 'CH' ]
record [ 19, 'AG', 'Aargau', 'CH' ]
record [ 20, 'TG', 'Thurgau', 'CH' ]
record [ 21, 'TI', 'Ticino', 'CH' ]
record [ 22, 'VD', 'Vaud', 'CH' ]
record [ 23, 'VS', 'Valais/Wallis', 'CH' ]
record [ 24, 'NE', 'Neuch�tel', 'CH' ]
record [ 25, 'GE', 'Gen�ve', 'CH' ]
record [ 26, 'JU', 'Jura', 'CH' ]
end
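
Why the mangling happens: the dBASE bytes here are UTF-8, but they are decoded with a single-byte encoding. "ü" is two bytes in UTF-8 (0xC3 0xBC); decoded byte-by-byte, those become two characters. A small sketch using Node's Buffer (the fix for this issue would be opening the file with an explicit encoding option, e.g. "utf-8"):

```javascript
// The UTF-8 bytes of "Zürich" decoded with a single-byte encoding
// (latin1 here) produce mojibake; decoding as UTF-8 round-trips cleanly.
const utf8Bytes = Buffer.from("Zürich", "utf-8");
const mangled = utf8Bytes.toString("latin1"); // two chars where "ü" was
const correct = utf8Bytes.toString("utf-8");
```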

GeoJSON is malformed when using #openShp

After checking the code, it seems there is a problem in shp/read.js:

function read() {
  var length = header.getInt32(4, false) * 2 - 4, type = header.getInt32(8, true);
  return length < 0 || (type && type !== that._type) ? skip() : that._source.slice(length).then(function(chunk) {
    return {done: false, value: type ? that._parse(view(concat(array.slice(8), chunk))) : null};
  });
}

The returned object is:

{done: , value: {
    type:
    coordinates:
}}

Instead, it should be:

{done: , value: {
    type:
    geometry: {
        type:
        coordinates:
    }
}}

How can we use shapefile js in a pipe chain

Hey there! I'm wondering if there is a way to pipe the output from shapefile to another transform stream, like:

const readableShape = shapefile.readStream('path/to/file.shp');

readableShape
   .pipe(myTransformStream)
   .pipe(myTransformStream2)
   .pipe(myWritableStream);

MultiGeometry Shape

Hi,
I have a problem. I'm building a web page where the user uploads just the .shp file, and I'm using your library to convert this shapefile to GeoJSON.
The problem is that the .shp file comes with 2 separate geometries, but the GeoJSON output contains only 1 of them.

Any idea to resolve this?

PolylineM/PolylineZ support

Any plans to add PolylineM/PolylineZ support? I wouldn't care if it didn't return the M or Z values, just the shapes (X, Y).

It would be easy to just add the type numbers, 13 and 23, to shp/index.js and then call the readPolyLine function, since all the M and Z values come after the point X, Y values.

Can’t fetch URLs that don’t have .shp/.dbf extensions.

Testing for .shp or .dbf at the end isn’t a good test for URLs. For example, it means you can’t pass a URL to an Observable FileAttachment (because those never have the .shp extension, and adding the .shp extension breaks the URL).

I think we should probably get rid of this magic. It might be okay to replace .shp with .dbf if the dbf argument is undefined, but otherwise I think we should leave the URLs as-is.

Reading from File / Blob?

Is there a simple way to turn a File (or Blob, I guess) from the browser into something that shapefile.open accepts? Right now I'm using

new Promise((resolve,reject) => {
    let reader = new FileReader();
    reader.onload = evt => resolve(reader.result);
    reader.readAsArrayBuffer(blob);
});

which works, but of course this loses the benefit of working stream-wise. It looks like WhatWG readable-streams support is pretty patchy, and focused on fetch rather than looking at local files.

If there is a better way to open from a local file (I get my File from <input type="file">), maybe an example in the docs would be helpful? If not, maybe it would be worthwhile to add Blob as a supported argument?

shp2json should ignore EPIPE errors on write.

For example:

$ shp2json -n example.shp | head -n2
{"type":"Feature","properties":{"FID":0},"geometry":{"type":"Point","coordinates":[1,2]}}
{"type":"Feature","properties":{"FID":1},"geometry":{"type":"Point","coordinates":[3,4]}}
events.js:160
      throw er; // Unhandled 'error' event
      ^

Error: write EPIPE
    at exports._errnoException (util.js:1012:11)
    at WriteWrap.afterWrite (net.js:793:14)

If stdout throws an EPIPE exception we should just swallow it silently and abort.
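
One way to swallow broken-pipe errors, as proposed: attach an error handler to stdout that ignores EPIPE and rethrows everything else. A sketch, not the actual shp2json fix:

```javascript
// Predicate for broken-pipe errors (downstream reader closed the pipe).
function isBrokenPipe(error) {
  return Boolean(error) && error.code === "EPIPE";
}

// Ignore EPIPE on stdout (e.g. when piping into `head`); rethrow the rest.
process.stdout.on("error", error => {
  if (!isBrokenPipe(error)) throw error;
  process.exit(0); // exit quietly once the pipe is gone
});
```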

It does not support uppercase SHP extensions

Hi,

I found that the shapefile.open() method emits an error for shapefiles with an uppercase extension. For example:
path = "/user/TEST_002.SHP";

It emits this error message:

{ [Error: ENOENT: no such file or directory, open '/user/TEST_002.SHP.shp']
  errno: -2,
  code: 'ENOENT',
  syscall: 'open',
  path: '/user/TEST_002.SHP.shp' }

Do you think it is a good idea to make the library support uppercase extensions?
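
A case-insensitive extension check would fix this; a sketch of what the issue suggests (the library currently tests for a lowercase ".shp" suffix before appending one):

```javascript
// Append ".shp" only when the path does not already end in ".shp",
// compared case-insensitively. Sketch of the proposed fix.
function ensureShpExtension(path) {
  return /\.shp$/i.test(path) ? path : path + ".shp";
}
```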

Parsing dbf file returns unexpected null values

The .dbf file downloaded here returns null values for each district. See this notebook for an example.

Other parsers (such as the one in R) detect values:
[screenshot: values detected by the R parser]

@Fil noted that:

the file is full of \x00 then fails on !(value = value.trim()) || isNaN(value = +value) ? null : value; in the readNumber function. I guess we should trim the \x00 not only spaces
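
The suggested fix can be sketched as trimming NUL padding in addition to spaces before parsing. readNumber here is a hypothetical stand-in for the library's internal function, not its actual source:

```javascript
// Strip NUL padding (\x00) as well as whitespace from both ends before
// attempting numeric parsing; return null for empty or non-numeric fields.
// Hypothetical sketch of the fix, not the library's internal code.
function readNumber(text) {
  const value = text.replace(/^[\s\u0000]+|[\s\u0000]+$/g, "");
  return !value || isNaN(+value) ? null : +value;
}
```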

Reproject output to EPSG:4326 by default.

I really love shp2json (and avoiding the massive dependency that is GDAL), but the fact it doesn't support reprojection is often a pain. (Particularly when you write a script that assumes that all your shapefiles will be EPSG:4326, then one arrives in a different projection and now you have to switch to OGR2OGR or something.)

Mike addressed the issue, saying " This is just a Shapefile parser, so it wouldn’t be the right place to implement coordinate system transformations."

I would like to argue against this :)

First, it's not just a Shapefile parser. shp2json is clearly a Shapefile-to-GeoJSON converter. And currently, it's producing GeoJSON that doesn't comply with the RFC. If a library or tool produces GeoJSON as output, it should produce that GeoJSON in EPSG:4326 by default (and maybe allow retaining the original projection with some special flag).

Secondly, it definitely seems like the job of this library to parse the .prj file if there is one, at least.

Sure, it's possible to use workarounds to get around this limitation, but they're not convenient, particularly when you're dealing with a Shapefile whose projection you don't know ahead of time. That makes this kind of command line hard to write:

shp2json foo.shp | reproject --to=EPSG:4326 --from=...what?

(Supporting reprojection to arbitrary CRSs would definitely be beyond the scope of the library, otoh.)

Support --ignore-geometry

This might sound odd, but it would actually be helpful to have a command line option --ignore-geometry which does not include geometries in the output. I'm finding that relatively often I want to examine the contents of some spatial files (eg, statistical boundaries) and all the geometry makes it very inconvenient: huge files, hard to load in a text editor, hard to see the properties I care about.

The downside is: either the output is malformed GeoJSON (containing either no geometry attribute, or a blank one), or it's not GeoJSON at all (in which case, what?) OTOH, the tool is called shp2json, not shp2geojson so maybe that's not disastrous. :)

Support feature count

This works really well, thank you! It would be helpful to have a way of knowing the total feature count, in order to provide percentages to the user when processing files.

I don't know much about how Shapefiles work, but ogrinfo can provide this info instantly, so presumably it's stored in metadata and doesn't require manually counting.

Error: error in unzip: code 9

I'm trying to convert a file with shp2json, and getting an error:

$ curl -o street-tree-data-wgs84.zip https://ckan0.cf.opendata.inter.prod-toronto.ca/dataset/6ac4569e-fd37-4cbc-ac63-db3624c5f6a2/resource/c1229af1-8ab6-4c71-b131-8be12da59c8e/download/street-tree-data-wgs84.zip
$ unzip street-tree-data-wgs84.zip
Archive:  street-tree-data-wgs84.zip
  inflating: TMMS_Open_Data_WGS84.shp.xml
  inflating: TMMS_Open_Data_WGS84.shx
 extracting: TMMS_Open_Data_WGS84.cpg
  inflating: TMMS_Open_Data_WGS84.dbf
  inflating: TMMS_Open_Data_WGS84.prj
  inflating: TMMS_Open_Data_WGS84.sbn
  inflating: TMMS_Open_Data_WGS84.sbx
  inflating: TMMS_Open_Data_WGS84.shp
$ shp2json TMMS_Open_Data_WGS84.shp -o trees.json
Error: error in unzip: code 9

The file in question is here.

It's possible this is me doing something wrong, as I don't know much about SHP files (which is why I was trying to convert). I've filed in case it's a real bug.

shp2json error in node v4.2.6

I see you have an example that use exactly what i want to have:
https://gist.github.com/mbostock/5557726

My node version:

node --version

v4.2.6

shp2json --version

0.6.4

v4.2.6 and 0.6.4 are the defaults for the Ubuntu LTS release.

lsb_release -a

No LSB modules are available.
Distributor ID: Ubuntu
Description: Ubuntu 16.04.3 LTS
Release: 16.04
Codename: xenial

Here what i do:

curl -o estados.zip http://mapserver.inegi.org.mx/MGN/mge2010v5_0.zip
curl -o  municipios.zip http://mapserver.inegi.org.mx/MGN/mgm2010v5_0.zip
unzip estados.zip 
unzip municipios.zip
ogr2ogr states.shp Entidades_2010_5.shp -t_srs "+proj=longlat +ellps=WGS84 +no_defs +towgs84=0,0,0"
ogr2ogr municipalities.shp Municipios_2010_5.shp -t_srs "+proj=longlat +ellps=WGS84 +no_defs +towgs84=0,0,0"

shp2json fail in both cases:

shp2json states.shp -o states2.json

(node) Buffer.set is deprecated. Use array indexes instead.
error: Invalid array length

shp2json municipalities.shp -o municipalities2.json

(node) Buffer.set is deprecated. Use array indexes instead.
error: Offset is outside the bounds of the DataView

Thanks.

Tests are broken since 0.5.

ran into the error below when running npm test

> [email protected] test /Users/john6251/github/shapefile
> tape 'test/**/*-test.js'

/Users/john6251/github/shapefile/dbf/index.js:1
(function (exports, require, module, __filename, __dirname) { import slice from "slice-source";
                                                              ^^^^^^
SyntaxError: Unexpected reserved word

tape doesn't appear to be happy being pointed straight at the ES6 source (at least on my box), but since I don't have any trouble running d3 tests locally that follow the same pattern, I'm stumped as to what I could have misconfigured.

Feed back - A build pain

Hello there !

First of all, thanks for this package!

But... I was unable to build shapefile.js and shapefile.min.js correctly.
I tried to build it using npm run prepublishOnly and I got some errors.

The first one I encountered is in the pretest script: the rm command is Linux-only (sorry, I'm on Windows, like 90% of people in the world 😩). Maybe you should use a remover like del-cli in conjunction with make-dir-cli for folder creation; they are cross-platform.

The second was the cp command, which is Linux-only too... Maybe cp-cli could help here (cross-platform, of course).

The third... tail, same problem, maybe the same solution!

Then, the $preamble is not processed, so I got a $(preamble) at the top of shapefile.js...

Fiouu...

A lot of the stuff you do in the prepublishOnly script could be integrated into rollup.config, like the uglifyjs part.

The only thing I was unable to fix is the tail part for building the test file, which, IMO, could be bundled entirely by rollup with another config file.

If you need some help making this package more cross-platform, I could send you a little PR.

Best regards,
Itee

Support for browser parsing?

It seems like the current implementation is dependent on Node.js. Do you have any thoughts on providing support for a pure client-side parser?

I thought about this a tiny bit, and it seems like one could use the new TypedArrays in JavaScript to do this. The DataView object allows one to specify the endianness of the byte array, which is something we need for reading shapefiles.

http://www.khronos.org/registry/typedarray/specs/latest/#8
Perhaps if the parsing of the features were moved into stream-independent functions, we could dynamically support both a stream client for Node and one for the browser.

Or maybe this is out of scope for this project so feel free to close if so.
Thanks.

Null feature geometry

When running a batch of shapefiles, I will sometimes get a null geometry, which according to the spec is allowed. In a collection of polygons this will end processing once the null is hit and throw the 'Encountered unknown shape type (0)' message.

NodeJS 18 Error Reading Files

I have seen this with several projects reading array buffers and streams that are failing. It seems to be an issue with the underlying stream-source library, but I can't be 100% sure.

Error is:

Error: unsupported shape type: 1120127468
    at new Shp (/home/ec2-user/Learning/shapefile/dist/shapefile.node.js:245:33)
    at /home/ec2-user/Learning/shapefile/dist/shapefile.node.js:239:12
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async Promise.all (index 0)

Coordinate Reference System

The coordinate reference system should be included, at least in the read GeoJSON.

http://geojson.org/geojson-spec.html#coordinate-reference-system-objects

shapefile.read('example.shp')
    .then(data => {
        data.type
        data.bbox
        data.features
        data.crs //=> undefined
    })

There's going to be a .prj file associated with the shapefile files; you might want to parse that and place that information into the GeoJSON crs. The projection file is a readable string.

GEOGCS["GCS_WGS_1984",DATUM["D_WGS_1984",SPHEROID["WGS_1984",6378137,298.257223563]],PRIMEM["Greenwich",0],UNIT["Degree",0.017453292519943295]]

Proj4 does a good job at parsing these CRS codes:

https://github.com/proj4js/proj4js/blob/master/lib/parseCode.js

Differing results between shapefile and mapshaper conversion

I was loosely certain that #59 was the same issue I'm seeing, but now I'm not so sure and wanted to open a new issue just in case.

A link to the source shapefile: wsp_120hr5km_latest.zip

The shapefile in question is one of the NHC's wind speed probability releases. It contains multiple concentric polygons that represent the different probabilities of winds. When I convert this shapefile with mapshaper (via the CLI, but with the same result in the browser interface), it seems to come over correctly:

[screenshot: mapshaper output renders as expected]
Output GeoJSON file is here: https://gist.github.com/rdmurphy/d283421b5b9cfa42f1fe29e390ce9230

But when I pass this same shapefile through shapefile, I get this:

[screenshot: shapefile output renders differently]
GeoJSON here: https://gist.github.com/rdmurphy/3a44e9be9736ec0364df6a926571b50a

Is this the same issue as #59, or something else?

And to be clear — totally understand if this is an accepted difference with how shapefile works! Just wasn't sure if perhaps I've surfaced a different issue.

Thank you!

Add support for unprojection.

Some shapefiles require conversion to EPSG:4326. It would be nice to detect these and convert the coordinates automatically from the input projection.

shapefile RangeError: Offset is outside the bounds of the DataView

I have eight concurrent nodejs processes with PM2 running each an app with this code:

const shapefile = require('shapefile')
// ....
// ....
shapefile.read(
  path.join(__dirname, 'res', value.unzippedFilenamesWithoutExtension + '.shp'),
  path.join(__dirname, 'res', value.unzippedFilenamesWithoutExtension + '.dbf'),
  { encoding: 'utf-8' }
).then(geojson => {
  regions[key].geojson = geojson
  console.log(
    `Shapefiles read from ${colors.cyan(value.unzippedFilenamesWithoutExtension + '.shp')} ` +
    `and from ${colors.cyan(value.unzippedFilenamesWithoutExtension + '.dbf')}`
  )
  callback()
}).catch((error) => {
  console.error('Error reading shapefile', error.stack)
  callback(Error(error))
})

and this error is thrown ONLY by 2 of those 8 processes; it seems something collides when reading the files

/home/jfolpf/.pm2/logs/geoptapi-error-8.log last 15 lines:
8|geoptapi |     at wrapper (/var/www/geoptapi/node_modules/async/dist/async.js:268:20)
8|geoptapi |     at iterateeCallback (/var/www/geoptapi/node_modules/async/dist/async.js:413:21)
8|geoptapi |     at /var/www/geoptapi/node_modules/async/dist/async.js:321:20
8|geoptapi |     at /var/www/geoptapi/prepareServer.js:110:7
8|geoptapi | 10-07-2021: Error: Error: Error: Error: RangeError: Offset is outside the bounds of the DataView
8|geoptapi |     at /var/www/geoptapi/server.js:27:16
8|geoptapi |     at /var/www/geoptapi/prepareServer.js:18:9
8|geoptapi |     at /var/www/geoptapi/node_modules/async/dist/async.js:2955:19
8|geoptapi |     at wrapper (/var/www/geoptapi/node_modules/async/dist/async.js:268:20)
8|geoptapi |     at iterateeCallback (/var/www/geoptapi/node_modules/async/dist/async.js:413:21)
8|geoptapi |     at /var/www/geoptapi/node_modules/async/dist/async.js:321:20
8|geoptapi |     at /var/www/geoptapi/node_modules/async/dist/async.js:2953:17
8|geoptapi |     at /var/www/geoptapi/prepareServer.js:114:7
8|geoptapi |     at wrapper (/var/www/geoptapi/node_modules/async/dist/async.js:268:20)
8|geoptapi |     at iterateeCallback (/var/www/geoptapi/node_modules/async/dist/async.js:413:21)

Do you have an idea of what it might be?

Holes that touch their outer rings are not read correctly

Hi,
Thanks for making a good library. I found that it does not properly read holes that touch the outside of their outer rings, even though the coordinates are in CCW order. Instead, it reads the holes as normal overlapping polygons. Please refer to the screenshot.
[screenshot]

Shouldn't we check that the holes are fully within their outer rings?
Thank you.
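
For background on how rings are classified: in the shapefile format, winding order distinguishes outer rings (clockwise) from holes (counterclockwise). A planar shoelace signed area is enough to tell the two apart; this is a sketch rather than the library's internal logic, and a full fix would additionally test ring containment as the issue suggests:

```javascript
// Shoelace signed area of a closed ring (first point repeated last).
// With y increasing upward, clockwise rings (shapefile outer rings) yield
// a negative value and counterclockwise rings (holes) a positive one.
function signedArea(ring) {
  let sum = 0;
  for (let i = 0, n = ring.length - 1; i < n; ++i) {
    const [x0, y0] = ring[i], [x1, y1] = ring[i + 1];
    sum += x0 * y1 - x1 * y0;
  }
  return sum / 2;
}
```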

Using with React native.

Hi. I need to import shapefiles in my RN app and plot data. I've tried to install this lib and use it, but it complains that the fs module dependency was not found. The fs package does not work on RN (there is a react-native-fs)... I'm a little lost with this. If anyone could help me with this, I would appreciate it.

Support retrieving z values

Hi @mbostock

Following on from #34 I was wondering if you'd be interested in receiving a pull request that provided full support for returning z values from a shapefile?

The GeoJSON spec supports having a z value, and if the data is there in the shapefile it would be nice to pass it through to the GeoJSON.

If you're happy to receive the pull request I'm happy to do the work on it :)

Cheers
Rowan

How to identify nested polygons or holes within the same block?

Hi,

It is a good library. When I try to read a shapefile containing multiple nested polygons, I get the result as a flat GeoJSON list. How can I identify which polygon is a hole of which? And how can I tell which ones are within the same block/record?

shp.open(filePath)
  .then(source => source.read()
    .then(function log(result) {
      if (result.done) return;
      console.log(result.value);
      return source.read().then(log);
    }))
  .catch(error => console.error(error.stack));

Results:
[
{ type: 'Feature', ...}
{ type: 'Feature', ...}
...
]

Thanks a lot.
John

unable to convert shp to topojson and plot in d3

I've read through the command line cartography posts and am still puzzled by this.

My shapefiles of regions in BC are here https://github.com/Mbrownshoes/shp_example

to convert and simplify I'm using the following command

geo2topo \
  regions=<(shp2json ABMS_RD_polygon.shp) \
  | toposimplify -P 0.1 \
  | topoquantize 1e5 \
  > bc.json

When I plot with d3 I'm using this code, which works for other topojson files you have posted, such as the one in this example (among others)
https://bl.ocks.org/mbostock/5562380/825c2889f93296bcb78e02a1f85a0981554c1ea7
However, I'm not seeing anything plotted. I do see a path when I inspect.

My resulting topojson file has a bbox as well as a transform at the start of the file:
{"type":"Topology","bbox":[273366.8019999984,359771.84100000095,1870586.801,1735720.873999999],"transform":{"scale":[15.972359713597154,13.759627926279242],"translate":[273366.8019999984,359771.84100000095]}

If I look at my topojson file in mapshaper it looks fine.

I'm puzzled by why this isn't working. Here's the d3 code to plot.

var svg = d3.select("svg"),
    width = +svg.attr("width"),
    height = +svg.attr("height");

// const projection = d3.geoAlbers()
//     .rotate([126, -10])
//     .center([7, 44])
//     .parallels([50, 58])
//     .scale(1970)
//     .translate([960 / 2, 600 / 2]);

const path = d3.geoPath();
// .projection(projection)

d3.queue()
    .defer(d3.json, "mx.json")
    .await(ready);

function ready(error, mx) {
  if (error) throw error;
  console.log(mx);

  var color = d3.scaleThreshold();

  svg.append("path")
      .datum(topojson.mesh(mx, mx.objects.regions, (a, b) => a !== b))
      .attr("fill", "none")
      .attr("stroke", "black")
      .attr("stroke-linejoin", "round")
      .attr("d", path);
}
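One thing worth noting about the bbox above: values in the hundreds of thousands to millions are projected coordinates (metres), not longitude/latitude, and an unprojected d3.geoPath expects degrees, which would place the path far outside the viewport. A quick diagnostic sketch, assuming a `[xmin, ymin, xmax, ymax]` bbox array as in the file above:

```javascript
// Quick sanity check: do these bbox values look like longitude/latitude
// (degrees in [-180, 180] x [-90, 90]), or like projected coordinates
// such as metres? d3.geoPath with no projection expects the former.
function looksLikeLonLat(bbox) {
  var xmin = bbox[0], ymin = bbox[1], xmax = bbox[2], ymax = bbox[3];
  return xmin >= -180 && xmax <= 180 && ymin >= -90 && ymax <= 90;
}

console.log(looksLikeLonLat([273366.8, 359771.8, 1870586.8, 1735720.9])); // false
console.log(looksLikeLonLat([-139, 48, -114, 60])); // true
```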

Samples?

I'm having trouble getting the streaming parser working. Are there any samples around? Here's my test code; it never reaches console.log('HERE!');
TIA

var shapefile = require('shapefile');
var shpfile = "shpfile";
var reader = shapefile.reader(shpfile);
reader.readHeader(function(err, header){
    console.log('HERE!');
    if (err) console.log('error: %s', err);
    else {
        reader.readRecord(function(error, rec){
            if (error) console.log('error: %s', error);
            console.log(rec);
        });
    }
});
reader.close();
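For reference, the reader/readHeader/readRecord callback API shown above comes from an older release; current versions (0.6+) expose a promise-based streaming API, where source.read() resolves to a {done, value} result like an async iterator. A sketch of a drain loop over that protocol, with a stand-in source for illustration:

```javascript
// Drains any source following the {done, value} streaming protocol,
// calling onRecord once per record.
function readAll(source, onRecord) {
  return source.read().then(function (result) {
    if (result.done) return;
    onRecord(result.value);
    return readAll(source, onRecord);
  });
}

// Real usage (assuming example.shp exists):
//   require("shapefile").open("example.shp")
//     .then(source => readAll(source, record => console.log(record)))
//     .catch(error => console.error(error.stack));

// Stand-in source mimicking the protocol, for illustration only:
var remaining = [{ type: "Feature" }, { type: "Feature" }];
var fakeSource = {
  read: function () {
    return Promise.resolve(
      remaining.length
        ? { done: false, value: remaining.shift() }
        : { done: true, value: undefined }
    );
  }
};

readAll(fakeSource, function (record) { console.log(record.type); });
```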

Consistent error when running shp2json from the CLI

Every time I run shp2json from the command line, I get the error

error: Decoder not present. Did you forget to include encoding-indexes.js first?

I've used shp2json successfully before; this error only started occurring after a full reinstall of OS X. It happens on Node 6.9.3 as well as 7.3 and 7.4.

Please let me know if there's any other information useful for debugging this.

Numeric column coming in as NULL property

Hi! We think we've found an edge case. We have a file with a numeric column, "Blend - *", for which shapefile returns all null values in the property field instead of the true values. The values can be read correctly by other software such as mapshaper or our C++ program, but not by this library.

The shape points are converted correctly, and the property "Blend - *" appears on the result object; only the associated property values are missing.

Code for testing:

const shapefile = require('shapefile')

async function parseShapefile(absPath) {
    const geoJson = await shapefile.read(absPath)
    if (!geoJson || !geoJson.features.length) { return {} }
    // Find numeric columns to use for rate selection
    for (const property of Object.keys(geoJson.features[0].properties)) {
        let numeric = true
        for (const feature of geoJson.features) {
            // Use both global isNaN() and parseFloat() to determine if we have a numeric column.
            // isNaN() catches most non-numeric columns, but not null;
            // parseFloat(null) is NaN, but a value such as "32 UAN" would parse as a number (32)
            if (isNaN(feature.properties[property]) || Number.isNaN(parseFloat(feature.properties[property]))) {
                numeric = false
            }
            else {
                console.log('Found Number')
            }
        }
        if (numeric) {
            console.log(`Numeric Column: ${property}`)
        }
    }
}

File in question:
TID_822919_OID_2363808_Raven_Viper_2021-04-26-03-53-54.zip
