jsonstream's Introduction

JSONStream

streaming JSON.parse and stringify

install

npm install JSONStream

example

var request = require('request')
  , JSONStream = require('JSONStream')
  , es = require('event-stream')

request({url: 'http://isaacs.couchone.com/registry/_all_docs'})
  .pipe(JSONStream.parse('rows.*'))
  .pipe(es.mapSync(function (data) {
    console.error(data)
    return data
  }))

JSONStream.parse(path)

parse stream of values that match a path

  JSONStream.parse('rows.*.doc')

The .. operator is the recursive descent operator from JSONPath, which will match a child at any depth (see examples below).

If your keys include characters like . or * etc, use an array instead: ['row', true, /^doc/].

In array form, the path elements may be strings, RegExps, booleans, and/or functions. The .. operator is also available in array form as {recurse: true}. Any object that matches the path will be emitted as 'data' (and piped downstream).

If path is empty or null, no 'data' events are emitted.

If you want keys emitted as well, prefix your * operator with $: obj.$* - in this case the data passed to the stream is an object with a key property holding the key and a value property holding the data.
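
A minimal sketch of the $* form (the input here is illustrative):

var JSONStream = require('JSONStream')

var stream = JSONStream.parse('rows.$*')

stream.on('data', function (data) {
  console.log(data.key, '=>', data.value) // a => 1, b => 2
})

stream.end('{"rows": {"a": 1, "b": 2}}')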

Examples

query a couchdb view:

curl -sS 'localhost:5984/tests/_all_docs?include_docs=true'

you will get something like this:

{"total_rows":129,"offset":0,"rows":[
  { "id":"change1_0.6995461115147918"
  , "key":"change1_0.6995461115147918"
  , "value":{"rev":"1-e240bae28c7bb3667f02760f6398d508"}
  , "doc":{
      "_id":  "change1_0.6995461115147918"
    , "_rev": "1-e240bae28c7bb3667f02760f6398d508","hello":1}
  },
  { "id":"change2_0.6995461115147918"
  , "key":"change2_0.6995461115147918"
  , "value":{"rev":"1-13677d36b98c0c075145bb8975105153"}
  , "doc":{
      "_id":"change2_0.6995461115147918"
    , "_rev":"1-13677d36b98c0c075145bb8975105153"
    , "hello":2
    }
  },
]}

we are probably most interested in rows.*.doc

create a Stream that parses the documents from the feed like this:

var stream = JSONStream.parse(['rows', true, 'doc']) //rows, ANYTHING, doc

stream.on('data', function(data) {
  console.log('received:', data);
});
//emits anything from _before_ the first match
stream.on('header', function (data) {
  console.log('header:', data) // => {"total_rows":129,"offset":0}
})

awesome!

In case you want the contents of each doc emitted along with its key:

var stream = JSONStream.parse(['rows', true, 'doc', {emitKey: true}]) //rows, ANYTHING, doc, items in docs with keys

stream.on('data', function(data) {
  console.log('key:', data.key);
  console.log('value:', data.value);
});

You can also emit the path:

var stream = JSONStream.parse(['rows', true, 'doc', {emitPath: true}]) //rows, ANYTHING, doc, items in docs with their path

stream.on('data', function(data) {
  console.log('path:', data.path);
  console.log('value:', data.value);
});

recursive patterns (..)

JSONStream.parse('docs..value') (or JSONStream.parse(['docs', {recurse: true}, 'value']) using an array) will emit every value object that is a child, grand-child, etc. of the docs object. In this example, it will match exactly 5 times at various depth levels, emitting 0, 1, 2, 3 and 4 as results.

{
  "total": 5,
  "docs": [
    {
      "key": {
        "value": 0,
        "some": "property"
      }
    },
    {"value": 1},
    {"value": 2},
    {"blbl": [{}, {"a":0, "b":1, "value":3}, 10]},
    {"value": 4}
  ]
}
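
A runnable sketch against the JSON above, feeding the document in as a string:

var JSONStream = require('JSONStream')

var stream = JSONStream.parse('docs..value')

stream.on('data', function (data) {
  console.log(data) // 0, 1, 2, 3, 4
})

stream.end(JSON.stringify({
  total: 5,
  docs: [
    {key: {value: 0, some: 'property'}},
    {value: 1},
    {value: 2},
    {blbl: [{}, {a: 0, b: 1, value: 3}, 10]},
    {value: 4}
  ]
}))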

JSONStream.parse(pattern, map)

provide a function that can be used to map or filter the json output. map is passed the value at that node of the pattern. If map returns a non-nullish value (anything but null or undefined), that value will be emitted in the stream; if it returns a nullish value, nothing will be emitted.
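
For example, a sketch of using map as a filter, against the couchdb feed shown earlier:

var stream = JSONStream.parse('rows.*.doc', function (doc) {
  if (doc.hello > 1) return doc // non-nullish: emitted downstream
  // returning undefined (nullish) drops this doc
})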

JSONStream also emits 'header' and 'footer' events. The 'header' event contains anything in the output that came before the first match, and the 'footer' event contains anything after the last match.

JSONStream.stringify(open, sep, close)

Create a writable stream.

you may pass in custom open, close, and separator strings, but by default JSONStream.stringify() will create an array (with default options open='[\n', sep='\n,\n', close='\n]\n')

If you call JSONStream.stringify(false) the elements will only be separated by a newline.

If you only write one item this will be valid JSON.

If you write many items, you can use a RegExp to split it into valid chunks.
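
A minimal sketch of the default array form, writing to stdout:

var JSONStream = require('JSONStream')

var stringify = JSONStream.stringify() // open='[\n', sep='\n,\n', close='\n]\n'
stringify.pipe(process.stdout)

stringify.write({hello: 1})
stringify.write({hello: 2})
stringify.end() // prints [\n{"hello":1}\n,\n{"hello":2}\n]\n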

JSONStream.stringifyObject(open, sep, close)

Very much like JSONStream.stringify, but creates a writable stream for objects instead of arrays.

Accordingly, open='{\n', sep='\n,\n', close='\n}\n'.

When you .write() to the stream you must supply an array with [ key, data ] as the first argument.
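
A minimal sketch (the key and value here are illustrative):

var JSONStream = require('JSONStream')

var stringify = JSONStream.stringifyObject()
stringify.pipe(process.stdout)

stringify.write(['greeting', {hello: 'world'}]) // [ key, data ]
stringify.end() // prints {\n"greeting":{"hello":"world"}\n}\n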

unix tool

query npm to see all the modules that browserify has ever depended on.

curl https://registry.npmjs.org/browserify | JSONStream 'versions.*.dependencies'

numbers

numbers will be emitted as numbers. huge numbers that cannot be represented in memory as JavaScript numbers will be emitted as strings. see https://github.com/creationix/jsonparse/commit/044b268f01c4b8f97fb936fc85d3bcfba179e5bb for details.
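
For example, a sketch; exactly where precision is lost is determined by jsonparse, per the commit above:

var JSONStream = require('JSONStream')

var stream = JSONStream.parse('*')

stream.on('data', function (value) {
  console.log(typeof value, value)
})

stream.end('{"small": 42, "big": 9999999999999999999999}')
// expect: number 42, then the big value as a string, since it cannot
// be represented exactly as a JavaScript number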

Acknowledgements

this module depends on https://github.com/creationix/jsonparse by Tim Caswell. thanks also to Florent Jaby for teaching me about parsing with https://github.com/Floby/node-json-streams

license

Dual-licensed under the MIT License or the Apache License, version 2.0

jsonstream's People

Contributors

agamm, alsotang, bzoz, daern91, dominictarr, doowb, eugendueck, floby, galniv, jeromew, joscha, jpage-godaddy, kemitchell, lbdremy, lpinca, mafintosh, mantoni, max-mapper, mfogel, nathanwills, notslang, robotnic, rstacruz, santigimeno, skenqbx, stephenlacy, urish, vojtatranta, wpears

jsonstream's Issues

support alternate JSON formats

it would be very helpful to provide a more forgiving alternate parser, such as allowing JSON5 / object literals. is this feasible with your current implementation?

p.s. - thank you for this project. very cool!

object mode?

Is this plugin compatible with streams in object mode? I'm working on a Gulp plugin that parses and transforms a series of JSON files (palantir/gulp-bower-overrides), and this plugin looks like a promising way to process the JSON, but I can't figure out how to make it work with Vinyl files.

through2 supports object mode via through.obj, but the classic through module used here does not seem to. Perhaps that's part of the issue?

Not able to parse the json file

I'm not able to parse my JSON file. [I am new to JavaScript]

This is my JSON file, located at './jsons/rep_controller.json':

{
  "apiVersion": "v12",
  "kind": "SController",
  "id": "siddhweb-controller",
  "labels": {
    "name": "siddhee"
  }
}

function loadJSONData(next) {
  logger.info('loadJSONData');
  var getStream = function () {
    var jsonData = './jsons/rep_controller.json',
        stream = fs.createReadStream(jsonData, {encoding: 'utf8'}),
        parser = JSONStream.parse('*');
    return stream.pipe(parser);
  };

  getStream().on('data', function(data) {
    logger.info('received: - ', data);
  });
}

I'm getting the error: undefined

Add a license

Hey -- this library looks great, and I was wondering what the license on it is (if any)?

stringify() writes separator on the front of the next chunk

Hey,

Is there a particular reason why stringify writes the separator in front of each chunk instead of at the end as the chunks are emitted?

if(first) { first = false ; stream.queue(op + json)}
else stream.queue(sep + json)
                  ^^^

I'm asking because I have an issue where if I pipe stringified JSON to bunyan-cli, it won't process the latest line of JSON until it gets the new line separator from the next chunk. Apparently it doesn't consider the line of JSON complete until it gets the line break.

-    if(first) { first = false ; stream.queue(op + json)}
-    else stream.queue(sep + json)
+    if(first) { first = false ; stream.queue(op + json + sep)}
+    else stream.queue(json + sep)

Changing the code like this fixes the issue but might break other things. Any thoughts?

stringify.write() does not invoke the callback supplied

According to the node docs, Writable.write() should take an optional callback for when the data has been written to the stream. This callback never gets invoked on JSONStream.

var JSONStream = require('JSONStream');
var outgoing = JSONStream.stringify();
outgoing.pipe(process.stdout);

setTimeout(function () {console.log('timeout')}, 10000);

console.log('sending');
outgoing.write(['foo'], function () {
    console.log('sent');
    process.exit();
});

Output:

sending
[
["foo"]timeout

Browserify throws an error because of #! /usr/bin/env node

The first line of index.js, #! /usr/bin/env node, causes browserify to throw an error when using brfs.

C:\Users\Michael\Github\javascript\convert-and-seed-audio>browserify node_modules/JSONStream -t brfs -o deleteme.js
SyntaxError: Unexpected character '#' (1:0)
    at Parser.pp.raise (C:\Users\Michael\Github\javascript\convert-and-seed-audio\node_modules\brfs\node_modules\static-module\node_modules\falafel\node_modules\acorn\dist\acorn.js:1745:13)
    at Parser.pp.getTokenFromCode (C:\Users\Michael\Github\javascript\convert-and-seed-audio\node_modules\brfs\node_modules\static-module\node_modules\falafel\node_modules\acorn\dist\acorn.js:3486:8)
    at Parser.pp.readToken (C:\Users\Michael\Github\javascript\convert-and-seed-audio\node_modules\brfs\node_modules\static-module\node_modules\falafel\node_modules\acorn\dist\acorn.js:3189:15)
    at Parser.pp.nextToken (C:\Users\Michael\Github\javascript\convert-and-seed-audio\node_modules\brfs\node_modules\static-module\node_modules\falafel\node_modules\acorn\dist\acorn.js:3181:71)
    at parse (C:\Users\Michael\Github\javascript\convert-and-seed-audio\node_modules\brfs\node_modules\static-module\node_modules\falafel\node_modules\acorn\dist\acorn.js:100:5)
    at module.exports (C:\Users\Michael\Github\javascript\convert-and-seed-audio\node_modules\brfs\node_modules\static-module\node_modules\falafel\index.js:22:15)
    at C:\Users\Michael\Github\javascript\convert-and-seed-audio\node_modules\brfs\node_modules\static-module\index.js:37:23
    at ConcatStream.<anonymous> (C:\Users\Michael\Github\javascript\convert-and-seed-audio\node_modules\brfs\node_modules\static-module\node_modules\concat-stream\index.js:36:43)
    at ConcatStream.emit (events.js:129:20)
    at finishMaybe (C:\Users\Michael\Github\javascript\convert-and-seed-audio\node_modules\brfs\node_modules\static-module\node_modules\concat-stream\node_modules\readable-stream\lib\_stream_writable.js:460:14)

C:\Users\Michael\Github\javascript\convert-and-seed-audio>browserify node_modules/JSONStream -o deleteme.js

C:\Users\Michael\Github\javascript\convert-and-seed-audio>

Moving the CLI to a separate file, or just deleting the line, should fix the issue.

parsing array data

Hi,
In my use case, the request data arrives as an array of objects, so I cannot define a key-based pattern. Is it possible to parse the stream on the fly and get one item at a time, so that my memory footprint stays low?

Thanks.
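
A sketch of one approach: * matches each element of a top-level array, so no key-based pattern is needed (the URL is a hypothetical stand-in):

var request = require('request')
var JSONStream = require('JSONStream')

request({url: 'https://example.com/items.json'}) // hypothetical endpoint returning a JSON array
  .pipe(JSONStream.parse('*'))
  .on('data', function (item) {
    console.log(item) // one array element at a time
  })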

Feature: make JSON.stringify call configurable

Maybe I am too picky about the formatting of the stringified JSON, but I really love to customize it for better human readability

e.g.

JSON.stringify(object, null, 2)

which returns

{
  "user_id": 1,
  "filename": "2011_08_29_21_47_32.pdf",
  "tmp_filename": "/tmp/uploads/7e588b748835d9d4e385c0c000ef9edd.pdf",
  "size": 3511730,
  "sha": null,
  "id": 133
}

maybe you can make it configurable?

see: https://github.com/dominictarr/JSONStream/blob/master/index.js#L100

Getting value of the token

In the example path rows.*.doc, is it currently possible to get the value that the * matched?

We have a JSON object representing our users table where the key in the object (the "star" in the query) is the user's ID.

We are trying to parse through and extract the user ids of users who meet certain criteria but, while we're able to determine which users match our criteria, we haven't been able to figure out which user corresponds to which data.

Example:

{
    "users": {
        "123": {
            "foo": "bar"
        },
        "456": {
            "bar": "foo"
        }
    }
}

We stream it via users.* which calls our callback correctly, but how do we figure out if the callback is associated with user 123 or 456?
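
A sketch using the $* form documented above, which emits each doc together with its key (here, the user id):

var JSONStream = require('JSONStream')

var stream = JSONStream.parse('users.$*')

stream.on('data', function (data) {
  // data.key is the user id ('123' or '456'), data.value is the user object
  console.log('user:', data.key, data.value)
})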

npmignore

Please add the tests to .npmignore for a smaller package.

No 'error' event

Hi,

This small unit test shows that JSONStream doesn't emit an "error" event:

var assert = require('assert');
var JSONStream = require('JSONStream');
var stream = require('stream');
var es = require('event-stream');

describe('JSONStream', function() {

  it('With unvalid stream', function(done) {

        var s = new stream.Readable();
        s._read = function noop() {};
        s.push('["foo":bar[');
        s.push(null);

        var parser = JSONStream.parse('*');
        parser
        .on('error', assert.ifError)
        .on('end', done);
        s.pipe(parser);
  });
});

returns me :

mocha --recursive

  JSONStream
    1) With unvalid stream

  0 passing (11ms)
  1 failing

  1) JSONStream With unvalid stream:
     Uncaught Error: Unexpected COLON(":") in state COMMA
      at Parser.proto.parseError (/home/gchauvet/Documents/jsonstreamtest/node_modules/JSONStream/node_modules/jsonparse/jsonparse.js:309:16)
      at Parser.proto.onToken [as _onToken] (/home/gchauvet/Documents/jsonstreamtest/node_modules/JSONStream/node_modules/jsonparse/jsonparse.js:394:12)
      at Parser.parser.onToken (/home/gchauvet/Documents/jsonstreamtest/node_modules/JSONStream/index.js:87:12)
      at Parser.proto.write (/home/gchauvet/Documents/jsonstreamtest/node_modules/JSONStream/node_modules/jsonparse/jsonparse.js:99:34)
      at Stream.<anonymous> (/home/gchauvet/Documents/jsonstreamtest/node_modules/JSONStream/index.js:21:12)
      at Stream.stream.write (/home/gchauvet/Documents/jsonstreamtest/node_modules/JSONStream/node_modules/through/index.js:26:11)
      at write (_stream_readable.js:583:24)
      at flow (_stream_readable.js:592:7)
      at _stream_readable.js:560:7
      at process._tickCallback (node.js:415:13)

options object that can configure matching parse and stringify streams.

we'll want to use this for a streaming api.

it would be really neat to be able to pass the opening, closing and separating strings in as a header or querystring.

then, users could specify whatever format was most convenient for them to parse.

of course, the default is valid json!

how to parse selected values from json?

I have the following code:

request({url: 'https://myurl.com/stream?method=json'})
    .pipe(JSONStream.parse('*'))     
    .pipe(es.mapSync(function (data) {
      console.log(data);
      var var1 = JSON.stringify(data);
      io.emit('notification', var1);
    }))

which works perfectly for receiving ALL data from the JSON stream, or when I change

    .pipe(JSONStream.parse('*')) 

to

.pipe(JSONStream.parse('Name')) 

to get only the name.

However what do I need to do in order to get

Name, Address and ZIP from the JSON stream? I could not find the answer to this anywhere.

The JSON looks like this:

{"Date":"2015-03-16T13:00:12.860630336Z","Name":"Peter","Address":"Demostreet","ZIP":"1234"}

emits wrong data sometimes (i think)

here's a test case. I would expect the emptyArray case to never emit, but instead it emits {"docs":[]}

var JSONStream = require('jsonstream')
var stream = require('stream')

var parser1 = JSONStream.parse(['docs', /./])
parser1.on('data', function(data) {
  console.log(data)
})

var parser2 = JSONStream.parse(['docs', /./])
parser2.on('data', function(data) {
  console.log(data)
})

function makeReadableStream() {
  var readStream = new stream.Stream()
  readStream.readable = true
  readStream.write = function (data) { this.emit('data', data) }
  readStream.end = function (data) { this.emit('end') }
  return readStream
}

var emptyArray = makeReadableStream()
emptyArray.pipe(parser1)
emptyArray.write('{"docs":[]}')
emptyArray.end()

var objectArray = makeReadableStream()
objectArray.pipe(parser2)
objectArray.write('{"docs":[{"hello":"world"}]}')
objectArray.end()

Parsing JSON without object at root

I have an API endpoint that accepts JSON bodies that are not JSON objects. JSONStream doesn't emit a root event in the scenario of, say, a stream containing true.

I am investigating a fix and can provide a PR, but would you be opposed to a JSON stream emitting any type of JSON value?

Callback when done.

I'm using the JSONStream to parse a very large JSON file and index it into ElasticSearch.
What I want to do, is know when all rows have been passed.

In short, I'm doing: parser = JSONStream.parse('*');

I'm piping the data through your other library (event stream) as follows:

parser.pipe(es.mapSync(function (data) { ... }))

Then, for each document traversed, I'm indexing that document to ElasticSearch.

Now I want to know the number of rows inserted. How do I do this programmatically?

thanks in advance.
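
A sketch of one approach, counting inside the mapSync callback and listening for the standard 'end' event (names here match the code above):

var count = 0

parser
  .pipe(es.mapSync(function (data) {
    count++ // index the document into ElasticSearch here
    return data
  }))
  .on('end', function () {
    console.log('all rows processed:', count)
  })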

Incoming data from the stream isn't fully parsed

The following snippet:

var request = require('request')
  , JSONStream = require('JSONStream')
  , es = require('event-stream')

var parser = JSONStream.parse(['rows', true]),
    req = request({url: 'http://127.0.0.1:10355/db/_all_docs?include_docs=true'}),
    logger = es.mapSync(function (data) {
        console.log(data)
        console.log('*********************************');
        return data  
    });

req.pipe(parser)
parser.pipe(logger)

Returns data in the following format:

*********************************
{ id: '_design/test',
  key: '_design/test',
  value: { rev: '1-18fca9bd97ed51926f4fe9030d81b6c2' },
  doc: 
   { _id: '_design/test',
     _rev: '1-18fca9bd97ed51926f4fe9030d81b6c2',
     views: 
      { 'test_results-all_devices-for-test-for-integration': [Object],
        'tests-for-integration-by-device': [Object],
        'by-path': [Object],
        'all-tests': [Object] },
     language: 'javascript' } }
*********************************

Notice the [Object] references. How can we read fully parsed objects from the stream? Thanks!
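
The objects are in fact fully parsed; Node's console.log only prints nested objects to a limited depth. A sketch of two ways to see the full structure inside the mapSync callback:

var util = require('util')

console.log(util.inspect(data, {depth: null}))
// or, as plain JSON:
console.log(JSON.stringify(data, null, 2))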

Returning a single object

Hello,

If my JSON body is just a single object, and that's what I want to get out of the data event, what's the right expression? I tried "$", but that didn't work for me.

Also, I read in #47 that there's a root event, but I didn't get anything out of that. Is the event available?

Error Handling?

@dominictarr
Hi, how do I handle the case where there's no pattern match? e.g.

var got = require('./');
var json = require('JSONStream')

got('http://registry.npmjs.org/koa-better-body/latest')
    .on('error', function(err) {
        console.log('not found in registry', err)
    })
    .pipe(json.parse('engines')) //.pipe(json.parse('homepage')) will show homepage
    .on('error', function(err) {
        console.log('parse error', err)
    })
    .pipe(process.stdout)

There is no engines object in that package.json, so how do I handle that? The above doesn't work.
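
A sketch of one way to detect a non-match, using a flag and the standard 'end' event (names match the code above):

var matched = false
var parser = json.parse('engines')

parser.on('data', function () { matched = true })
parser.on('end', function () {
  if (!matched) console.log('no engines field found')
})

got('http://registry.npmjs.org/koa-better-body/latest').pipe(parser)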

Unhandled stream error in pipe

The code:

var request = http.get(someURL, function(response) {
            response.pipe(JSONStream.parse(...

The call stack:

stream.js:94
      throw er; // Unhandled stream error in pipe.
            ^
Error: Unexpected "!" at position 1 in state START
    at Parser.proto.charError (...\node_modules\JSONStream
\node_modules\jsonparse\jsonparse.js:84:16)
    at Parser.proto.write (...\node_modules\JSONStream\nod
e_modules\jsonparse\jsonparse.js:112:23)
    at Stream.<anonymous> (...\node_modules\JSONStream\ind
ex.js:21:12)
    at Stream.stream.write (...\node_modules\JSONStream\no
de_modules\through\index.js:26:11)
    at write (_stream_readable.js:602:24)
    at flow (_stream_readable.js:611:7)
    at _stream_readable.js:579:7
    at process._tickCallback (node.js:442:13)

What am I doing wrong?

memory leak

I have not been able to figure out where exactly in the chain I get this memory leak, but whenever I pipe request's JSONStream from couchdb _all_docs (about 85k docs), it eats up my memory like crazy. Anyone else experienced this? From my limited experience creating custom stream objects, I would expect memory usage to stay fairly consistent while parsing.

Example code:

var down = JSONStream.parse(["rows", true, "doc"])
request.pipe(down)

How to run it?

Hi,
I just found your project and downloaded it, but I don't know how to run it.
Thanks

How to parse json from array

Pardon the newbie question.

I receive data via a third-party API.

It comes back in the form

{ asOfDate: xxx,
  loans: [
    { a: x,
      b: y,
      …
    },
    { a: xx,
      b: yy,
      …
    },
    …
  ]
}

My goal is to stream each element of the array and process it as it comes in, rather than building the entire object and iterating over each element in the array.

Can JSONStream help with this, or do you have other suggestions?

FWIW, each row goes into a mongo collection.
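
A sketch using the pattern syntax from the README; loans.* emits one loan at a time (the file name is a hypothetical stand-in for the API response stream):

var fs = require('fs')
var JSONStream = require('JSONStream')

fs.createReadStream('loans.json') // hypothetical dump of the API response
  .pipe(JSONStream.parse('loans.*'))
  .on('data', function (loan) {
    console.log(loan) // insert into the mongo collection here
  })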

FATAL ERROR: CALL_AND_RETRY_2 Allocation failed - process out of memory

What the subject says. It happens when parsing big couchdb dumps and doing essentially nothing, for example:

var request = require('request')
var JSONStream = require('JSONStream')
var es = require('event-stream')

var parser = JSONStream.parse(['rows', true])
var req = request({url: 'http://some-couchdb-with-lot-of-data/poms/_all_docs?include_docs=true'})
var logger = es.mapSync(function (data) {
        console.log(data.id)
        return data
    })

req.pipe(parser).pipe(logger)

Any thoughts?

Piping to process.stdout doesn't work?

This code works:

var js = require("JSONStream")
var rq = require("request")

var stream = rq({url: "http://isaacs.couchone.com/registry/_all_docs"})
var parser = js.parse("rows.*")

parser.on("data", function (dato) {
  console.log("\n----------------")
  console.log(dato)
})

stream.pipe(parser)

This code doesn't work:

var js = require("JSONStream")
var rq = require("request")

var stream = rq({url: "http://isaacs.couchone.com/registry/_all_docs"})
var parser = js.parse("rows.*")

parser.on("data", function (dato) {
  console.log("\n----------------")
})

stream.pipe(parser).pipe(process.stdout)

Why??

The error stack is:

net.js:611
throw new TypeError('invalid data');
^
TypeError: invalid data
at WriteStream.Socket.write (net.js:611:11)
at Stream.ondata (stream.js:51:26)
at Stream.EventEmitter.emit (events.js:126:20)
at drain (/home/q2dg/node_modules/through/index.js:36:16)
at Stream.stream.queue.stream.push (/home/q2dg/node_modules/through/index.js:45:5)
at Parser.parser.onValue (/home/q2dg/node_modules/JSONStream/index.js:88:16)
at Parser.proto.emit (/home/q2dg/node_modules/JSONStream/node_modules/jsonparse/jsonparse.js:326:8)
at Parser.proto.pop (/home/q2dg/node_modules/JSONStream/node_modules/jsonparse/jsonparse.js:321:8)
at Parser.proto.onToken as _onToken
at Parser.parser.onToken (/home/q2dg/node_modules/JSONStream/index.js:94:12)

Thanks!!
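
parse() emits parsed JavaScript objects, while process.stdout only accepts strings and buffers; a sketch of one fix is to re-stringify before piping (using the names from the code above):

stream.pipe(parser).pipe(js.stringify()).pipe(process.stdout)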

array

I just have an array like [a, b, c, d]. I can't seem to get it to emit those elements; what would you pass to .parse()?

cheers

parser barfs on numbers in exponential notation

Test case:

var JSONStream = require('JSONStream'),
    parser = JSONStream.parse([/./]);
parser.end('{"foo": 1.33e+12}');

Results in this error:

Error: Unexpected "}" at position 16 in state NUMBER7
    at Parser.charError (/home/andreas/onewebdump/node_modules/JSONStream/node_modules/jsonparse/jsonparse.js:81:16)
    at Parser.write (/home/andreas/onewebdump/node_modules/JSONStream/node_modules/jsonparse/jsonparse.js:239:19)
    at Stream.write (/home/andreas/onewebdump/node_modules/JSONStream/index.js:76:12)
    at Stream.end (/home/andreas/onewebdump/node_modules/JSONStream/index.js:80:14)
    at Object.<anonymous> (/home/andreas/onewebdump/bug.js:5:8)
    at Module._compile (module.js:441:26)
    at Object..js (module.js:459:10)
    at Module.load (module.js:348:31)
    at Function._load (module.js:308:12)
    at Array.0 (module.js:479:10)

Where's 0.2.0

Where's version 0.2.0? Looks like you didn't push it.

JSONStream > 0.10 fails on nested objects

This is what I get when I install JSONStream 0.10:

$ echo '{"bar":{"foo":"baz"}}' | node_modules/.bin/JSONStream '.bar.foo'
["baz"]

But this is what I get when I install any version after 0.10:

$ echo '{"bar":{"foo":"baz"}}' | node_modules/.bin/JSONStream '.bar.foo'
["baz"/usrdata/proj/scraping/instagram-screen-scrape/node_modules/JSONStream/index.js:71
            this.stack[j].value = null
                                ^

TypeError: Cannot set property 'value' of undefined
    at Parser.parser.onValue (/usrdata/proj/scraping/instagram-screen-scrape/node_modules/JSONStream/index.js:71:33)
    at Parser.proto.emit (/usrdata/proj/scraping/instagram-screen-scrape/node_modules/jsonparse/jsonparse.js:350:8)
    at Parser.proto.pop (/usrdata/proj/scraping/instagram-screen-scrape/node_modules/jsonparse/jsonparse.js:345:8)
    at Parser.proto.onToken (/usrdata/proj/scraping/instagram-screen-scrape/node_modules/jsonparse/jsonparse.js:416:12)
    at Parser.parser.onToken (/usrdata/proj/scraping/instagram-screen-scrape/node_modules/JSONStream/index.js:94:12)
    at Parser.proto.write (/usrdata/proj/scraping/instagram-screen-scrape/node_modules/jsonparse/jsonparse.js:101:34)
    at Stream.<anonymous> (/usrdata/proj/scraping/instagram-screen-scrape/node_modules/JSONStream/index.js:21:12)
    at Stream.stream.write (/usrdata/proj/scraping/instagram-screen-scrape/node_modules/through/index.js:26:11)
    at Socket.ondata (_stream_readable.js:525:20)
    at emitOne (events.js:77:13)

string notation for path

the docs mention using the array:

JSONStream.parse(['rows', true])

It always takes me a minute to figure out what that array even is again. Can you pass "rows.*" etc.? If not, it would be cool to split/map such strings back into the array form; the string syntax would be a nice alternative.

Does not work on Windows

Hi,

I am having trouble on Windows.

npm i -g JSONStream
curl https://registry.npmjs.org/browserify | JSONStream 'versions.*.dependencies'

stream.js:74
      throw er; // Unhandled stream error in pipe.
      ^

Error: Invalid JSON (Unexpected "S" at position 4 in state STOP)
    at Parser.proto.charError (D:\npm\node_modules\JSONStream\node_modules\jsonparse\jsonparse.js:78:16)
    at Parser.proto.write (D:\npm\node_modules\JSONStream\node_modules\jsonparse\jsonparse.js:104:23)
    at Stream.<anonymous> (D:\npm\node_modules\JSONStream\index.js:23:12)
    at Stream.stream.write (D:\npm\node_modules\JSONStream\node_modules\through\index.js:26:11)
    at Socket.ondata (_stream_readable.js:528:20)
    at emitOne (events.js:77:13)
    at Socket.emit (events.js:169:7)
    at readableAddChunk (_stream_readable.js:146:16)
    at Socket.Readable.push (_stream_readable.js:110:10)
    at Pipe.onread (net.js:523:20)

At first I got the same error on react-stdio and opened this issue ReactTraining/react-stdio#2

missing stream.unpipe method

Hi,

I'm trying to .unpipe a JSONStream.

My code:

var fs = require('fs')
var stream = require('JSONStream').parse()
fs.createReadStream('package.json').pipe(stream)
stream.pipe(process.stdout)
stream.unpipe(process.stdout)

Expected result: nothing (the stream is not piped anywhere else)

Actual result:

TypeError: Object #<Stream> has no method 'unpipe'

Big numbers support

I've been working with the Vine API which uses very large integers for user ids (ex: 1114259415508639744). I'm using JSONStream to parse the results and am, predictably, having problems...

I know these numbers are a problem for JSON.parse() and JS generally, but after reading over your related jsonparse issue, given its age and closed state, I'm wondering if I should still expect this behavior in JSONStream?

Int32Array bug for some (sometimes older) browser

Hi,

I have an issue with Int32Array. If the browser does not support typed arrays, I guess it breaks. In my case (Chromium Version 23.0.1271.97 (171054)), it breaks because Int32Array does not have a "slice" method. (The break happens in the jsonparse module.)

But I was wondering if we couldn't just remove the Int32Array stuff now that Buffer is browserify-able? Otherwise, if anyone knows a good typed-array polyfill, it would be nice to note this requirement in the README so that older/unusual browsers can be worked around instead of breaking.

Thanks !

creationix/jsonparse bumped to 0.0.5

Hey @dominictarr,

Thanks for the great module! When I was using it I discovered an underlying bug in jsonparse relating to utf8 encoding, and patched it. It has landed in 0.0.5. I pulled down JSONStream, updated the package.json to 0.0.5, and ran npm test; there is one error [Error: Unexpected RIGHT_BRACE("}") in state VALUE], but that is also present with jsonparse 0.0.4, so I think it's safe to update to 0.0.5. If you want I can send a PR for this.

Thanks!
