
xml-stream's People

Contributors

alexhenrie, artazor, assistunion, dgreisen, disolovyov, dturing, jfhbrook, jmgunn87, nyxtom, punund, stuartpb, trdarr


xml-stream's Issues

http-stream.js

I'm not sure if this is the right place to write this, but I am having issues using xml-stream with HTTP.

I have tried to use http-stream.js, but since the RSS feed can no longer be read (Twitter no longer allows that RSS link), I cannot work out how to do it.

xml.on('data') is never run (I am testing just with console.log('hi')).

So all I can receive is the information served with the web page, rather than the data it would display (the XML).

The XML link I am trying to use is http://cloud.tfl.gov.uk/TrackerNet/LineStatus

Any help would be greatly appreciated
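
A minimal sketch of one way to do this, assuming the feed can simply be fetched with Node's http module and the response stream handed straight to XmlStream; the element name LineStatus is only a guess based on the feed URL and may need adjusting:

var http = require('http');
var XmlStream = require('xml-stream');

http.get('http://cloud.tfl.gov.uk/TrackerNet/LineStatus', function (response) {
  // XmlStream accepts any readable stream, including an HTTP response.
  var xml = new XmlStream(response);

  // 'LineStatus' is assumed from the feed URL; adjust to the real element name.
  xml.on('endElement: LineStatus', function (status) {
    console.log(status);
  });

  xml.on('end', function () {
    console.log('done');
  });
}).on('error', function (err) {
  console.error(err);
});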

need negation

Need a negation symbol for selecting tags, something like endElement: SECTION !CHAPTER HD, which would fire an event for any <HD> that is inside a <SECTION> but not inside a <CHAPTER>. I do not believe CSS has a standard negation selector, probably because you can accomplish negation through cascading.

Use case:
Say I have XML like the following:


<SECTION>
  <... several unknown and variable levels deep>
    <HD>Section Title</HD>
  </...>
  <CHAPTER>
    <HD>Chapter Title</HD>
  </CHAPTER>
</SECTION>

I wish to create a listener to capture only <HD>Section Title</HD>. I do not know the exact path between <SECTION> and <HD>. If I use the selector SECTION HD I will also get the <HD>Chapter Title</HD>, which is undesirable.
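
A possible workaround, sketched under the assumption that the input comes from a file named document.xml (hypothetical name): track <CHAPTER> nesting manually and ignore any <HD> seen while inside one.

var fs = require('fs');
var XmlStream = require('xml-stream');

var xml = new XmlStream(fs.createReadStream('document.xml'));
var chapterDepth = 0;

// Count how deep we are inside <CHAPTER> elements.
xml.on('startElement: CHAPTER', function () { chapterDepth++; });
xml.on('endElement: CHAPTER', function () { chapterDepth--; });

// Fires for every <HD> under <SECTION>; only handle the ones outside <CHAPTER>.
xml.on('endElement: SECTION HD', function (hd) {
  if (chapterDepth === 0) {
    console.log('Section heading:', hd);
  }
});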

null container causing a TypeError exception

A null ref exception is thrown occasionally while I'm parsing a well-formed XML file. I don't know how to reproduce it, since it doesn't happen every time. My code uses stream pause/resume a lot, so there may be a timing component to this bug. Logging it here as an FYI for the author.

/home/tim/parser/node_modules/xml-stream/lib/xml-stream.js:430
      container[container.length - 1] = val;
                         ^
TypeError: Cannot read property 'length' of undefined
    at null.<anonymous> (/home/tim/parser/node_modules/xml-stream/lib/xml-stream.js:430:26)
    at EventEmitter.emit (events.js:95:17)
    at Parser.parse (/home/tim/parser/node_modules/xml-stream/node_modules/node-expat/lib/node-expat.js:23:22)
    at parseChunk (/home/tim/parser/node_modules/xml-stream/lib/xml-stream.js:513:14)
    at ReadStream.<anonymous> (/home/tim/parser/node_modules/xml-stream/lib/xml-stream.js:533:11)
    at ReadStream.EventEmitter.emit (events.js:95:17)
    at ReadStream.<anonymous> (_stream_readable.js:746:14)
    at ReadStream.EventEmitter.emit (events.js:92:17)
    at emitReadable_ (_stream_readable.js:408:10)
    at emitReadable (_stream_readable.js:404:5)

Newlines removed by parser

We use this library to parse XML feeds that feed data into our API. Some of this data is Markdown with multiple newlines, but those newlines are being removed during parsing, which breaks our Markdown.

Escaping invalid characters

I have an issue while parsing an XML file:

events.js:72
        throw er; // Unhandled 'error' event
              ^
Error: not well-formed (invalid token) in line 52584
    at parseChunk (/home/namlook/Documents/Projects/kalitmo/node_modules/xml-stream/lib/xml-stream.js:507:26)
    at ReadStream.<anonymous> (/home/namlook/Documents/Projects/kalitmo/node_modules/xml-stream/lib/xml-stream.js:514:7)
    at ReadStream.EventEmitter.emit (events.js:95:17)
    at ReadStream.<anonymous> (_stream_readable.js:746:14)
    at ReadStream.EventEmitter.emit (events.js:92:17)
    at emitReadable_ (_stream_readable.js:408:10)
    at emitReadable (_stream_readable.js:404:5)
    at readableAddChunk (_stream_readable.js:165:9)
    at ReadStream.Readable.push (_stream_readable.js:127:10)
    at onread (fs.js:1561:12)

At line 52584 I have the character "\x1d". I haven't found a way to make xml-stream ignore this character and continue parsing.

xml-stream is a great piece of software, but for now I'm stuck on this issue. Any ideas?
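
One possible workaround, sketched here with no claim that it is the intended solution: scrub the offending control characters before the bytes reach xml-stream. The filename is hypothetical, and the chunk-by-chunk toString assumes multi-byte characters are not split across chunk boundaries (a StringDecoder would make that robust).

var fs = require('fs');
var Transform = require('stream').Transform;
var XmlStream = require('xml-stream');

var scrub = new Transform();
scrub._transform = function (chunk, encoding, callback) {
  // Drop C0 control characters that are illegal in XML 1.0,
  // keeping tab (\x09), LF (\x0a) and CR (\x0d).
  var clean = chunk.toString('utf8').replace(/[\x00-\x08\x0b\x0c\x0e-\x1f]/g, '');
  callback(null, clean);
};

var xml = new XmlStream(fs.createReadStream('input.xml').pipe(scrub));
xml.on('endElement: item', function (item) { // 'item' is a placeholder selector
  console.log(item);
});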

Memory leak possibly?

I'm having trouble tracking this one down, but it looks like when going through a very large XML file (over 1 GB), the space allocated by the node process grows to roughly the size of the file as it is processed over time. Is there something I'm doing wrong? Shouldn't the memory be freed as I process new items, so that old items I have already read are deallocated? What can I do to determine whether this is truly the case?

Stream overflow

Hello, and thanks for your work.
I'm parsing a 750 MB XML file that consists of small XML objects. After five minutes of parsing (around 15k records) it gets slow and takes about a second per record, and after half an hour it crashes from a memory leak. I tried to force GC, but it didn't help, so the problem seems to be the size of the file. Can I somehow discard the head of the stream, or throttle the stream?
I tried splitting the stream and changing bufferSize, but nothing helps. Maybe recreating the stream with a moved start position would help? Stream.unshift?

Any help appreciated

Return unparsed XML

Is there any way to get a chunk of unparsed XML?

For example, if I'm using xml.on('endElement: record', function(record) { });, record will give me all the parsed data. I'm looking for a way to get the entire <record></record> element’s raw XML.

Parent > child selector is not a strict 1:1 mapping

I have a map which has nested items e.g.

<root>
    <nestedItems>
        <item>...</item>
        <item>...</item>
    </nestedItems>
    <item>...</item>
</root>

I am trying to catch the startElement event for root > item (I'm not interested in any nested item elements), but it still seems to raise an event for all instances.
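
As a stopgap (assuming the parent > child selector really does match descendants in this version), the <nestedItems> wrapper can be tracked by hand; the filename below is hypothetical.

var fs = require('fs');
var XmlStream = require('xml-stream');

var xml = new XmlStream(fs.createReadStream('map.xml'));
var insideNested = false;

xml.on('startElement: nestedItems', function () { insideNested = true; });
xml.on('endElement: nestedItems', function () { insideNested = false; });

xml.on('startElement: root > item', function (item) {
  if (!insideNested) {
    // Only <item> elements that are direct children of <root> get here.
    console.log(item);
  }
});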

XMLStream should be a true Stream object

The lib should be refactored to become a "true" Stream object, so that we can pipe a stream into XMLStream. I've done initial work which I know works for a simple example, but there's a lot more rework needed before all the functionality works. It would also be nice if the old API kept working.

What I want is for it to work like this (for example):

var xmlstream = new XMLStream(); // Options can (and should?) be passed here rather than by calling functions further down the road

fs.createReadStream('my.file').pipe(xmlstream);

... // Do business as usual with .on('endElement: item', function ( ... ) { ... })

You can find my initial work at https://github.com/cjblomqvist/xml-stream in my stream branch: https://github.com/cjblomqvist/xml-stream/tree/stream

Let me know if you want me to open a pull request. At this point my branch is in rough enough shape that a PR didn't seem worthwhile yet.

Passing Readable Streams - works on v0.10.29, fails on v0.10.30

Adapting the example for a sample usage of creating a new readable stream and passing to the parser:

var fs = require('fs')
  , filename = require('path').join(__dirname, 'collect-preserve.xml')
  , XmlStream = require('../lib/xml-stream')
  , File = require('vinyl')
  , fromArray = require('read-stream/array')
  , Readable = require('stream').Readable;


function createStream (data) {
    var stream = new Readable({objectMode: true});

    stream.push(data);
    stream.push(null);

    return stream;
}

var file = new File({contents: new Buffer(fs.readFileSync(filename))});
var stream = fromArray([file.contents]);
var results = [];

var xml = new XmlStream(stream);

xml.preserve('item', true);
xml.collect('subitem');
xml.on('endElement: item', function(item) {
  results.push(item);
});

xml.on('end', function () {
    console.log(results);
});

This works on v0.10.29 (it prints the results array filled, as expected), but on v0.10.30 it prints an empty list.

P.S.: it works fine when passing the result of fs.createReadStream directly; I'm specifically trying to pass a stream that I create with data on the fly.

Any suggestions?

Problem with collecting children

I have the following structure:

<images>
  <image-group view-type="large">
    <image path="foo.png"/>
    <image path="bar.png"/>
    <image path="foobar.jpg"/>
  </image-group>
</images>

I am unable to collect all of the image nodes; I only get the last one in each image-group node.

I have tried all kinds of collect selectors, but none work.

Can you help, please?
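
For reference, the pattern one would expect from the README is a plain collect('image'), sketched here with a hypothetical filename; whether this actually works around the reported behaviour is exactly what the issue is about.

var fs = require('fs');
var XmlStream = require('xml-stream');

var xml = new XmlStream(fs.createReadStream('images.xml'));

// Collect repeated <image> children into an array instead of letting the
// last one overwrite the others.
xml.collect('image');

xml.on('endElement: image-group', function (group) {
  console.log(group.image); // expected: an array with one entry per <image>
});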

preserving mixed content when updateElement

I would like to chunk/update an XML file item by item. The item nodes can have mixed content, for example:

this<foo>(1)</foo>is<foo>(2)</foo>text<foo>(3)</foo>content

Mixed content must be preserved for my application.

With version 0.3.2:

If not in collect mode, multiple subitems with identical names are overwritten by the last one. If in collect mode, subitems are preserved, but the text of mixed-content nodes is collected piece by piece, so the original ordering of the mixed content cannot be reconstructed in the updateElement callback.

The use of a finite automaton is very sophisticated. I will try to modify xml.endElement and xml.text, e.g. by introducing an array for mixed content, but I am not sure whether it's that easy ;-)

regards
Peter

Possible to collect everything?

This is a really great library; I'm parsing a 300 MB XML file with no issues. Just a question, though: is there a way to collect everything? The XML I'm parsing is generated by someone else, and I'd prefer collection to happen by default, so that I don't have to pre-determine each element that could possibly have multiple entries.

http-stream.js will stop working soon

The patch I submitted for http-stream.js (#20) changed the path to the RSS feed to use the V1 API, since Twitter removed support for RSS through the main site.

Twitter is going to deactivate the V1 API on March 5, 2013. The 1.1 API requires authentication for all requests and does not support RSS (only JSON).

To continue to serve as a working example, the script will need to be rewritten before that happens.

Possible replacements:

Thoughts?

stream bz2 xml file

Hi,

I need to parse Wikipedia dumps for research purposes. The dumps are very, very big (well over 1 TB); these are the "page-meta-history.xml" files from http://dumps.wikimedia.org.
Since I can't afford to decompress the files first, can I work with a bz2 file through the xml-stream API?

Thank you
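
One possible approach, sketched under the assumption that a streaming bzip2 decompressor such as the third-party unbzip2-stream package can be piped between the file and the parser; the package API and the filename are assumptions, and <page> is the per-article element in MediaWiki dumps.

var fs = require('fs');
var bz2 = require('unbzip2-stream'); // third-party streaming bzip2 decompressor (assumed API)
var XmlStream = require('xml-stream');

var source = fs.createReadStream('pages-meta-history.xml.bz2'); // hypothetical filename
var xml = new XmlStream(source.pipe(bz2()));

xml.on('endElement: page', function (page) {
  // Handle one <page> at a time without ever holding the whole dump in memory.
  console.log(page);
});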

won't install due to node-expat using node-waf

This is the infamous node-expat node-waf install error, which is still present in the latest npm version.

You have 0.4.2 as master, while 0.4.1 is on npm.

That npm version is nowhere to be found on GitHub, so I can't look in its package.json to see which versions of iconv and node-expat it depends on.

Where has 0.4.1 gone, and when will you publish 0.4.2 on npm?

EINVAL, Incomplete character sequence

When parsing the data below, the first <p> node parses, but the second fails with the following error. I can't tell what the difference between the two nodes is. The library looks great otherwise!

/Users/seanhess/itv/telus/node_modules/xml-stream/lib/xml-stream.js:478
      data = self._encoder.convert(data);
                           ^
Error: EINVAL, Incomplete character sequence.
    at /Users/seanhess/itv/telus/node_modules/xml-stream/lib/xml-stream.js:478:28
    at [object Object].<anonymous> (/Users/seanhess/itv/telus/node_modules/xml-stream/lib/xml-stream.js:488:7)
    at [object Object].emit (events.js:67:17)
    at [object Object]._emitData (fs.js:1155:10)
    at afterRead (fs.js:1137:10)
    at Object.wrapper [as oncomplete] (fs.js:254:17)



<p id="391492" t="Rumeurs" rt="Rumeurs" d="Esther se réconcilie avec Vincent, mais elle se demande si elle a pris la bonne décision." rd="Esther se réconcilie avec Vincent, 
mais elle se demande si elle a pris la bonne décision." et="Ma vie est un téléroman">
<f id="2"/>
<k id="1" v="532931"/>
<k id="2" v="19"/>
<k id="6" v="20030409"/>
<k id="10" v="Program"/>
<c id="403"/></p>
<p id="391493" t="Rumeurs" rt="Rumeurs" d="Esther a décidé de faire une pause dans sa relation avec Vincent; Clara accepte de poser nue pour Charles." rd="Esther fait une pause dans sa relation; Clara accepte de poser nue." et="Ma vie est un téléroman">
<f id="2"/>
<k id="1" v="532931"/>
<k id="2" v="20"/>
<k id="6" v="20030416"/>
<k id="10" v="Program"/>
<c id="403"/></p>

NPM

Can you please publish an updated version to npm? I'd rather not use the GitHub URL...

Thanks!

Element text

Hi all,

First of all, thank you for the software; I've been putting it to good use :D

I would like to know how to reproduce the example in your README that says "By default, element text is returned as one concatenated string." element.$text comes out undefined in my setup. I tried preserving things, but that doesn't change the fact that there is no $text.

In case it isn't clear what I mean: I would like the text of all the children combined. Is that possible with xml-stream, or do I have to do it myself?

Thanks!
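
A small sketch for checking what the parser actually returns; the filename is hypothetical, and 'item' / 'subitem' stand in for whatever elements the real document uses. $text should at least be populated on elements that directly contain text; whether a parent's $text also includes its children's text may depend on the version, so inspecting the whole object is the quickest way to see.

var fs = require('fs');
var XmlStream = require('xml-stream');

var xml = new XmlStream(fs.createReadStream('sample.xml'));

xml.on('endElement: subitem', function (el) {
  // Elements that directly contain text should expose it as $text.
  console.log('subitem:', el.$text);
});

xml.on('endElement: item', function (el) {
  // Inspect the parent object to see whether $text is set here too.
  console.log('item:', el);
});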

If upstream fails, error is not propagated

Is it expected behaviour?

We get error:

events.js:141
      throw er; // Unhandled 'error' event
      ^

Test case:

var util = require('util');
var Readable = require('stream').Readable;
var XmlStream = require('xml-stream');
var test = require('tape'); // or any tap-style harness providing t.pass/t.end

test('Rejects if upstream fails', function (t) {
  var Stream = function (opt) {
    Readable.call(this, opt);
  };
  util.inherits(Stream, Readable);
  Stream.prototype._read = function () {
    this.emit('error', new Error('OMG!'));
  };
  var stream = new Stream();

  var xml = new XmlStream(stream);
  xml.on('error', t.pass);
  xml.on('error', function () {
    t.end();
  });
});

Fix:

var util = require('util');
var Readable = require('stream').Readable;
var XmlStream = require('xml-stream');
var test = require('tape'); // or any tap-style harness providing t.pass/t.end

test('Rejects if upstream fails', function (t) {
  var Stream = function (opt) {
    Readable.call(this, opt);
  };
  util.inherits(Stream, Readable);
  Stream.prototype._read = function () {
    this.emit('error', new Error('OMG!'));
  };
  var stream = new Stream();

  var xml = new XmlStream(stream);

  // Propagate error
  stream.on('error', xml.emit.bind(xml, 'error'));

  xml.on('error', t.pass);
  xml.on('error', function () {
    t.end();
  });
});

I'll submit a PR a bit later.

Is there any way to turn collect on by default, or for all nodes?

The README says: 'By default, parsed element node contains children as properties. In the case of several children with same names, the last one would overwrite others. To collect all of subitem elements in an array use ...'. I would like this to happen for all nodes, but specifying every node would be time-consuming.

Node 8 -CRASH-

When I hit the require for xml-stream, node crashes:

FATAL ERROR: v8::ToLocalChecked Empty MaybeLocal.
 1: node::Abort() [/usr/local/bin/node]
 2: node::FatalException(v8::Isolate*, v8::Local<v8::Value>, v8::Local<v8::Message>) [/usr/local/bin/node]
 3: v8::V8::ToLocalEmpty() [/usr/local/bin/node]
 4: node::inspector::(anonymous namespace)::CallAndPauseOnStart(v8::FunctionCallbackInfo<v8::Value> const&) [/usr/local/bin/node]
 5: v8::internal::FunctionCallbackArguments::Call(void (*)(v8::FunctionCallbackInfo<v8::Value> const&)) [/usr/local/bin/node]
 6: v8::internal::MaybeHandle<v8::internal::Object> v8::internal::(anonymous namespace)::HandleApiCallHelper<false>(v8::internal::Isolate*, v8::internal::Handle<v8::internal::HeapObject>, v8::internal::Handle<v8::internal::HeapObject>, v8::internal::Handle<v8::internal::FunctionTemplateInfo>, v8::internal::Handle<v8::internal::Object>, v8::internal::BuiltinArguments) [/usr/local/bin/node]
 7: v8::internal::Builtin_Impl_HandleApiCall(v8::internal::BuiltinArguments, v8::internal::Isolate*) [/usr/local/bin/node]
 8: 0x66fb710437d
 9: 0x66fb72d0969
10: 0x66fb72ca823

Problems with special encoded character

I'm parsing a big XML file (700 MB), and one line contains a special character reference: &#12;
When xml-stream reaches this line, it fails with this error:

events.js:141549
      throw er; // Unhandled 'error' event
      ^

Error: reference to invalid character number in line 12482025
    at parseChunk (/Users/.../node_modules/xml-stream/lib/xml-stream.js:514:26)
    at ReadStream.<anonymous> (/Users/.../node_modules/xml-stream/lib/xml-stream.js:521:7)
    at emitOne (events.js:77:13)
    at ReadStream.emit (events.js:169:7)
    at readableAddChunk (_stream_readable.js:146:16)
    at ReadStream.Readable.push (_stream_readable.js:110:10)
    at onread (fs.js:1744:12)
    at FSReqWrap.wrapper [as oncomplete] (fs.js:576:17)

Any ideas?

failed to map segment from shared object

Good day,

Getting the following error right after "var xmlstream = require('xml-stream')":

"/var/www/testapp/node_modules/xml-stream/node_modules/node-expat/node_modules/bindings/bindings.js:83
throw e
^

Error: /var/www/testapp/node_modules/xml-stream/node_modules/node-expat/build/Release/node_expat.node: failed to map segment from shared object: Operation not permitted"

node: v4.2.2
npm: 2.14.7
CentOS release 6.6
Linux 2.6.32-504.23.4.el6.x86_64 x86_64
[email protected] node_modules/xml-stream
├── [email protected] ([email protected], [email protected], [email protected], [email protected])
├── [email protected] ([email protected], [email protected])
└── [email protected] ([email protected])

Thank you

Linear XML not firing events

Hello,

I was working with a system-generated XML file that didn't have any line breaks or indentation. xml-stream wasn't able to detect the events and find the ending tag for each section. Once I added line breaks and indentation, it worked just fine. It seems like it should be able to find a tag regardless, unless there's some option that needs to be set for that to happen.

Thanks!

xml-stream isn't detecting events in short xml documents

I'm using xml-stream inside a Node.js worker, retrieving XML with the request package. For some reason I can pull data from moderately large files, but not from short ones. On short documents, the only event that fires is 'end'.

I've verified that the documents are coming through, but I haven't been able to figure out why the events aren't firing.

// `options`, `job` and `cb` come from the surrounding worker code.
var request = require('request');
var xmlStream = require('xml-stream');

request.get(options)
.on('error', function(err){
  console.log(err);
  job.fail();
  cb();
})
.on("response", function(response, body){
  var allEvents = [];
  response.setEncoding('utf8');

  var xml = new xmlStream(response);

  xml.collect('r25:item');
  xml.collect('r25:id');
  xml.collect('r25:name');

  xml.on('endElement: r25:item', function(data){
    console.log("Found Event");
    var thisEvent = 0;
    allEvents.push(thisEvent);
  })

  xml.on('end', function(data){
    console.log("EventCount", allEvents.length);
    console.log();
    job.done();
    cb();
  });
})

Memory leak on large files

When parsing a huge file (hundreds of megabytes), it leaks memory.

  1. Burn through a huge file.
  2. Make sure to collect some of the nodes.
  3. Watch the memory usage go up
  4. Pause the stream after a while
  5. Memory never goes back down, even when paused.

check for well-formedness

This isn't an issue but a feature request. Since xml-stream depends on expat, is it possible to execute xmlwf on a stream?

Thanks.

Non-top level nodes

I have some xml which is broken down like this

<rootNode>
  <header>...</header>
  <payload>
    <item>...</item>
    <item>...</item>
    ...
  </payload>
</rootNode>

What I'd like to do is get the header elements and then also all the items; however, it isn't clear to me whether the module even supports that.

I ended up reading the header elements from the first few lines with a manual process, then attempting to read the <item> elements with xml-stream:

 xmlStream.preserve('item');
 xmlStream.on('endElement: item', function (d) { ...

But this won't work.

I've successfully used xml-stream to pull data from XML which is merely a repetition of items inside a root item with no problem whatsoever.

Does anyone know how I can simply / elegantly get my data?
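
One thing worth trying, sketched with a hypothetical filename: register separate listeners for the header and for the items in the same pass, since descendant and child selectors do not require the elements to be at the top level.

var fs = require('fs');
var XmlStream = require('xml-stream');

var xml = new XmlStream(fs.createReadStream('data.xml'));

xml.on('endElement: rootNode > header', function (header) {
  console.log('header:', header);
});

xml.on('endElement: payload > item', function (item) {
  console.log('item:', item);
});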

$text missing in selected elements with no children

Given input and code:

<item>
  <subitem>one</subitem>
  <subitem>two</subitem>
</item>

var stream = require('fs').createReadStream('sample.xml');
var XmlStream = require('xml-stream');
var xml = new XmlStream(stream);
xml.on('updateElement: subitem', function(item) {
  console.log(item);
});

It currently outputs:

{}
{}

But should output:

{ '$text': 'one' }
{ '$text': 'two' }

Wildcards?

I may be missing something obvious here, but suppose I have the following XML:

<fruit>
   <banana>3</banana>
   <apple>5</apple>
   <orange>6</orange>
</fruit>

And I want all children of the fruit element, but I don't know ahead of time what each possible fruit is; someone could throw in a guava and I'd have to respond to that. According to the CSS selector spec:

http://www.w3.org/TR/css3-selectors/#universal-selector

fruit > *

should work, but doesn't.

Let's make it simpler and say I want endElement to fire on ANY element in the XML:

xmlStream.on('endElement: *', (item) -> console.log(item))

gives me nothing.

Are wildcards supported at all?
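
A workaround sketch in the absence of a universal selector (filename hypothetical): let the parent close and enumerate its children, which the parsed object carries as plain properties. Repeated child names still collapse to the last value unless they are collect()ed.

var fs = require('fs');
var XmlStream = require('xml-stream');

var xml = new XmlStream(fs.createReadStream('fruit.xml'));

xml.on('endElement: fruit', function (fruit) {
  Object.keys(fruit).forEach(function (name) {
    if (name === '$' || name === '$text' || name === '$name') return; // skip metadata keys
    console.log(name, '=>', fruit[name]);
  });
});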

Does not really stream, keeps objects in memory

We used it for some time, until we found the node process crashing because of excessive memory usage.

After deeper investigation we realized that "streaming" 250 MB XMLs through xml-stream is not possible; somewhere in the first third of the file node crashes.

Every time a new chunk of interesting data was found, memory increased by 1 MB and never went back down.

I did not investigate what exactly leaks memory and switched directly to node-expat. Now processing a 350 MB XML takes at most about 400 MB of memory (it depends on the largest object passed through and is usually up to 50 MB with rare spikes), but it never grows and it actually streams.

I am not sure I will have time to take a deeper look at the code and figure out what exactly holds on to the extra objects.

Might be related to #16

Installation Problems

Hey Guys,

When I try to install (fresh server) I get the following output:

root@s16009678:/home/john# npm install xml-stream -g
npm http GET https://registry.npmjs.org/xml-stream
npm http 304 https://registry.npmjs.org/xml-stream
npm http GET https://registry.npmjs.org/iconv
npm http GET https://registry.npmjs.org/node-expat
npm http 304 https://registry.npmjs.org/node-expat
npm http 304 https://registry.npmjs.org/iconv
npm WARN [email protected] dependencies field should be hash of <name>:<version-range> pairs
npm WARN [email protected] devDependencies field should be hash of <name>:<version-range> pairs

> [email protected] install /usr/lib/node_modules/xml-stream/node_modules/node-expat
> node-waf configure build

Checking for program g++ or c++          : /usr/bin/g++
Checking for program cpp                 : /usr/bin/cpp
Checking for program ar                  : /usr/bin/ar
Checking for program ranlib              : /usr/bin/ranlib
Checking for g++                         : ok
Checking for node path                   : ok /usr/lib/node_modules/
Checking for node prefix                 : ok /usr
'configure' finished successfully (0.020s)
Waf: Entering directory `/usr/lib/node_modules/xml-stream/node_modules/node-expat/build'
[1/2] cxx: node-expat.cc -> build/Release/node-expat_1.o
../node-expat.cc:3:25: error: node_events.h: No such file or directory
../node-expat.cc:17: error: expected class-name before '{' token
../node-expat.cc: In static member function 'static void Parser::Initialize(v8::Handle<v8::Object>)':
../node-expat.cc:24: error: 'EventEmitter' has not been declared
../node-expat.cc: In static member function 'static v8::Handle<v8::Value> Parser::New(const v8::Arguments&)':
../node-expat.cc:63: error: 'class Parser' has no member named 'Wrap'
../node-expat.cc: In constructor 'Parser::Parser(const XML_Char*)':
../node-expat.cc:68: error: class 'Parser' does not have any field named 'EventEmitter'
../node-expat.cc: In static member function 'static void Parser::StartElement(void*, const XML_Char*, const XML_Char**)':
../node-expat.cc:264: error: 'class Parser' has no member named 'Emit'
../node-expat.cc: In static member function 'static void Parser::EndElement(void*, const XML_Char*)':
../node-expat.cc:274: error: 'class Parser' has no member named 'Emit'
../node-expat.cc: In static member function 'static void Parser::StartCdata(void*)':
../node-expat.cc:283: error: 'class Parser' has no member named 'Emit'
../node-expat.cc: In static member function 'static void Parser::EndCdata(void*)':
../node-expat.cc:292: error: 'class Parser' has no member named 'Emit'
../node-expat.cc: In static member function 'static void Parser::Text(void*, const XML_Char*, int)':
../node-expat.cc:302: error: 'class Parser' has no member named 'Emit'
../node-expat.cc: In static member function 'static void Parser::ProcessingInstruction(void*, const XML_Char*, const XML_Char*)':
../node-expat.cc:312: error: 'class Parser' has no member named 'Emit'
../node-expat.cc: In static member function 'static void Parser::Comment(void*, const XML_Char*)':
../node-expat.cc:322: error: 'class Parser' has no member named 'Emit'
../node-expat.cc: In static member function 'static void Parser::XmlDecl(void*, const XML_Char*, const XML_Char*, int)':
../node-expat.cc:335: error: 'class Parser' has no member named 'Emit'
../node-expat.cc: In static member function 'static void Parser::EntityDecl(void*, const XML_Char*, int, const XML_Char*, int, const XML_Char*, const XML_Char*, const XML_Char*, const XML_Char*)':
../node-expat.cc:353: error: 'class Parser' has no member named 'Emit'
Waf: Leaving directory `/usr/lib/node_modules/xml-stream/node_modules/node-expat/build'
Build failed:  -> task failed (err #1):
        {task: cxx node-expat.cc -> node-expat_1.o}
npm ERR! error installing [email protected]
npm ERR! error installing [email protected]

npm ERR! [email protected] install: `node-waf configure build`
npm ERR! `sh "-c" "node-waf configure build"` failed with 1
npm ERR!
npm ERR! Failed at the [email protected] install script.
npm ERR! This is most likely a problem with the node-expat package,
npm ERR! not with npm itself.
npm ERR! Tell the author that this fails on your system:
npm ERR!     node-waf configure build
npm ERR! You can get their info via:
npm ERR!     npm owner ls node-expat
npm ERR! There is likely additional logging output above.
npm ERR!
npm ERR! System Linux 3.0.0-14-server
npm ERR! command "node" "/usr/bin/npm" "install" "xml-stream" "-g"
npm ERR! cwd /home/john
npm ERR! node -v v0.6.8
npm ERR! npm -v 1.1.0-2
npm ERR! code ELIFECYCLE
npm ERR! message [email protected] install: `node-waf configure build`
npm ERR! message `sh "-c" "node-waf configure build"` failed with 1
npm ERR! errno {}

npm ERR! Error: ENOENT, no such file or directory '/usr/lib/node_modules/xml-stream/node_modules/___iconv.npm/package/deps/libiconv/libcharset/tools/hpux-11.00'
npm ERR! You may report this log at:
npm ERR!     <http://github.com/isaacs/npm/issues>
npm ERR! or email it to:
npm ERR!     <[email protected]>
npm ERR!
npm ERR! System Linux 3.0.0-14-server
npm ERR! command "node" "/usr/bin/npm" "install" "xml-stream" "-g"
npm ERR! cwd /home/john
npm ERR! node -v v0.6.8
npm ERR! npm -v 1.1.0-2
npm ERR! path /usr/lib/node_modules/xml-stream/node_modules/___iconv.npm/package/deps/libiconv/libcharset/tools/hpux-11.00
npm ERR! fstream_path /usr/lib/node_modules/xml-stream/node_modules/___iconv.npm/package/deps/libiconv/libcharset/tools/hpux-11.00
npm ERR! fstream_type File
npm ERR! fstream_class FileWriter
npm ERR! code ENOENT
npm ERR! message ENOENT, no such file or directory '/usr/lib/node_modules/xml-stream/node_modules/___iconv.npm/package/deps/libiconv/libcharset/tools/hpux-11.00'
npm ERR! errno {}
npm ERR! fstream_stack Object.oncomplete (/usr/lib/node_modules/npm/node_modules/fstream/lib/writer.js:204:26)
npm ERR!
npm ERR! Additional logging details can be found in:
npm ERR!     /home/john/npm-debug.log
npm not ok

I figured it was something to do with node-expat so I installed the latest version of that:

root@s16009678:/home/john# npm install node-expat -g
npm http GET https://registry.npmjs.org/node-expat
npm http 304 https://registry.npmjs.org/node-expat

> [email protected] install /usr/lib/node_modules/node-expat
> node-waf configure build

Checking for program g++ or c++          : /usr/bin/g++
Checking for program cpp                 : /usr/bin/cpp
Checking for program ar                  : /usr/bin/ar
Checking for program ranlib              : /usr/bin/ranlib
Checking for g++                         : ok
Checking for node path                   : ok /usr/lib/node_modules/
Checking for node prefix                 : ok /usr
Checking for header expat.h              : yes
'configure' finished successfully (0.080s)
Waf: Entering directory `/usr/lib/node_modules/node-expat/build'
[1/2] cxx: node-expat.cc -> build/Release/node-expat_1.o
[2/2] cxx_link: build/Release/node-expat_1.o -> build/Release/node-expat.node
Waf: Leaving directory `/usr/lib/node_modules/node-expat/build'
'build' finished successfully (0.202s)
[email protected] /usr/lib/node_modules/node-expat

I think [2/2] being yellow means it has a warning, but it all reports OK.

I tried npm install xml-stream again and it gave me the same output as before.

This is my first experience with Node, so I am probably missing something. Any chance you could help me get this working? I'm pretty confused.

Cheers,

John

With N MyElement elements and M endElement handlers, the handlers are called N×M times.

With this XML:

<root> <MyElement>hi</MyElement> <MyElement>bye</MyElement> </root>

and these handlers:

xml.on('endElement: MyElement', (myElement) => {
  console.log('detected 1');
});

xml.on('endElement: MyElement', (myElement) => {
  console.log('detected 2');
});

the console shows:

detected 1
detected 2
detected 1
detected 2

tag names with underscore (_)

Hi,

Tag names with underscore like these:

<things>
    <say_this>
        Hello
    </say_this>
</things>

aren't parsed. A regex seems to be used to match tag names, and it doesn't appear to accept underscores. Can you please help?

I am using the on method as follows:
xml.on("startElement: things > say_this", function(entityTag) {
});

Thanks!

File stream ends before node-expat stream ends

It seems that when you use the pause and resume functions to run async operations on an endElement event, xml-stream keeps reading. I sometimes have problems where the readable file stream ends before the node-expat stream completes.

In my code, I transform detected endElement objects and write them asynchronously to a file, using xml-stream's pause and resume methods during the write operation. With large files, when I handle the end event I end the writable file stream, but after that I still get more endElement events and then the following error (you cannot write to a closed writable stream):

Error: write after end
    at writeAfterEnd (_stream_writable.js:166:12)
    at WriteStream.Writable.write (_stream_writable.js:211:5)

I am working on a PR to mitigate this error.

Bump node-expat again?

Disclaimer: I am running node 1.11.

The node-expat dependency is not installing correctly; I'm guessing my version of npm is causing issues.
