
Comments (10)

ctalkington avatar ctalkington commented on May 26, 2024

Confirmed the issue. It seems that when the first file is empty and you try adding a second file, it just dies out. This seems to happen near the end of the process, when it should be writing out the Central Directory.


ctalkington avatar ctalkington commented on May 26, 2024

It would seem there is something up with the events when it comes to the second file. Basically, the second file never fires off an end event, so it just dies out without triggering the zip close events. Not sure if this is a streams bug or what; it's very odd that the first file fires its events and the second does not.


ctalkington avatar ctalkington commented on May 26, 2024

Basically, from what I can tell, fs.createReadStream opens the file right away and starts emitting its events. Because the file is opened so quickly the second time around (I guess it's just a lot faster than the processing of the first file), it has actually already emitted end (tested from the actual calling script via file event hooks), so the event will never fire again, which is why it just dies out without finishing.

At this point, I'm not sure there is much that can be done from a stream handler standpoint. Append already calls pause right away, but with fs.createReadStream opening and reading immediately, that has no effect. Also, since the file is zero bytes, there is no data buffered up for the back-pressure system to work with.

At this point in time, I'd suggest lazyStream for filesystem streams. This may also affect other streams that load data as part of their init with no data. At some point I may implement an internal lazy-loading system, where a function (as source) is passed in and called internally right before processing, but I've been trying to stay away from complicating the code with such things.
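
For illustration, a minimal sketch of the lazy-stream approach suggested above. It assumes the lazystream npm package and the current archiver API (method names differed in the 0.4.x era); the file names are placeholders:

```js
var fs = require('fs');
var archiver = require('archiver');
var lazystream = require('lazystream'); // assumed package providing a lazy Readable wrapper

var archive = archiver('zip');
archive.pipe(fs.createWriteStream('out.zip'));

// Wrap each file in a lazy Readable so fs.createReadStream is not called
// (and no premature 'end' event can fire) until archiver actually reads the entry.
['empty-file.txt', 'second-file.txt'].forEach(function (name) {
  archive.append(new lazystream.Readable(function () {
    return fs.createReadStream(name);
  }), { name: name });
});

archive.finalize();
```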


F21 avatar F21 commented on May 26, 2024

Since grunt-contrib-compress 0.4.8 solves the issue, this is probably not a problem with archiver then?


ctalkington avatar ctalkington commented on May 26, 2024

@F21 I'm going to keep this open and file an issue with the core devs to see if my theory is correct. IMHO, fs.createReadStream or Readable should probably handle this edge case, as it happens so fast that most stream handlers won't be able to control it in time after being passed around, especially since there's no data to keep the Readable's internal buffer from emitting all the end events when fs.read does its push of null due to the return of zero-length data.
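
As a rough illustration of the race being described here (this reflects the old-mode stream behavior of the Node versions current at the time; streams2 and later hold back 'end' until a consumer reads the stream), with 'empty.txt' as a placeholder zero-byte file:

```js
var fs = require('fs');

// With old-mode streams, fs.createReadStream starts reading immediately,
// so a zero-byte file can emit 'end' before any listener is attached.
var source = fs.createReadStream('empty.txt');

setTimeout(function () {
  // Attached too late: on the affected Node versions the 'end' event has
  // already fired and is lost, so this callback never runs.
  source.on('end', function () {
    console.log('end fired for empty.txt');
  });
}, 100);
```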


discretepackets avatar discretepackets commented on May 26, 2024

Yup, I think your theory is correct!

It doesn't just happen with zero-length files. For me, the zipping process will end prematurely if some of the files in the folder I want to zip are <10KB. Since I'm using eachLimit from the async library (https://github.com/caolan/async) to append all the files to the archive, setting the "parallel limit" to 1 so that .append runs only after the previous file has been appended solves the issue for me.
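
For reference, a rough sketch of that workaround. It assumes append accepts a completion callback, as the description above implies it did at the time (later archiver versions queue entries internally and append takes no callback), and uses placeholder file names:

```js
var fs = require('fs');
var async = require('async');
var archiver = require('archiver');

var archive = archiver('zip');
archive.pipe(fs.createWriteStream('out.zip'));

var files = ['a.txt', 'b.txt', 'tiny.txt']; // some of these may be very small or empty

// A limit of 1 makes the appends strictly sequential: each file is only
// appended after the previous one has been fully processed.
async.eachLimit(files, 1, function (file, done) {
  archive.append(fs.createReadStream(file), { name: file }, done);
}, function (err) {
  if (err) throw err;
  archive.finalize();
});
```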


ctalkington avatar ctalkington commented on May 26, 2024

Yah, I really hate using control-flow wrappers when archiver can handle it internally. I really would like to see fs read streams follow the new streams2 design more closely and not read data before something requests it.


ctalkington avatar ctalkington commented on May 26, 2024

Been doing some research, though, and a reread of the docs. I'm actually thinking the root of the issue may be that right now streams are being paused/resumed by Archiver (need to confirm we still need such logic), which kicks the stream into old mode. I'm going to test without pause/resume in the next release.


ctalkington avatar ctalkington commented on May 26, 2024

v0.4.2 will bring some improvements: it avoids using data events and only uses pause/resume when _ReadableState isn't defined. That property stores the buffer in streams2, so if it doesn't exist we need to attempt a pause. This should at least help avoid the timing issues where possible.
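
A rough sketch of the kind of check described (not the actual archiver source; normalizeSource is a hypothetical helper, and Node's streams2 exposes the internal state as _readableState):

```js
// Illustrative only: pause old-mode (streams1) sources, which have no
// streams2 internal buffer, and leave streams2 sources alone.
function normalizeSource(source) {
  if (typeof source._readableState === 'undefined') {
    // Old-mode source: pause it so data/end events are not lost before
    // the archiver starts consuming it.
    source.pause();
  }
  // streams2 sources buffer their data internally, so no pause is needed.
  return source;
}
```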


ctalkington avatar ctalkington commented on May 26, 2024

closing as this appears to be resolved through recent changes.

