Comments (10)
Confirmed the issue. It seems that when the first file is empty and you try adding a second file, the process just dies out. This appears to happen near the end of processing, when it should be writing out the Central Directory.
from node-archiver.
It would seem there is something up with the events when it comes to the second file: basically, the second file never fires an end event, so the process dies out without triggering the zip close events. Not sure if this is a streams bug or what; it's very odd that the first file fires the event and the second does not.
Basically, from what I can tell, fs.createReadStream opens the file right away and starts emitting its events. When the file is opened that quickly the second time around (presumably because it's much faster than the processing of the first file), it has actually already emitted end (verified from the calling script via file event hooks), so the event will never fire again, which is why the process dies out without finishing.
At this point, I'm not sure there is much that can be done from a stream-handler standpoint. We already call pause right away in append, but since fs.createReadStream opens and reads immediately, that has no effect. Also, since the file is zero bytes, there is no data buffered for the back-pressure system to work with.

For now, I'd suggest lazystream for filesystem streams. This may also affect other streams that load data as part of initialization when there is no data. At some point I may implement an internal lazy-loading system that accepts a function (as the source) which is called internally right before processing, but I've been trying to avoid complicating the code with such things.
Since grunt-contrib-compress 0.4.8 solves the issue, this is probably not a problem with archiver then?
@F21 I'm going to keep this open and file an issue with the core devs to see if my theory is correct. IMHO, fs.createReadStream or Readable should probably handle this edge case: it happens so fast that most stream handlers won't be able to react in time after the stream has been passed around, especially since there is no data to keep its Readable internal buffer from emitting all the end events when fs.read does its push of null due to the return of zero-length data.
Yup, I think your theory is correct!
It doesn't just happen with zero-length files. For me, the zipping process ends prematurely if some of the files in the folder I want to zip are under 10 KB. Since I'm using eachLimit from the async library (https://github.com/caolan/async) to append all the files to the archive, setting the parallel limit to 1, so that .append runs only after the previous file has been appended, solves the issue for me.
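For readers without the async library at hand, the same limit-of-1 serialization can be sketched with plain promises. The eachLimit below is a hypothetical stand-in for async.eachLimit, and appendFile in the usage comment stands for whatever per-file .append work you do:

```javascript
// Run `worker` over `items` with at most `limit` tasks in flight.
// With limit = 1, each task fully finishes before the next starts,
// which is the workaround described above.
async function eachLimit(items, limit, worker) {
  const queue = items.slice();
  const lanes = Array.from({ length: limit }, async () => {
    while (queue.length > 0) {
      await worker(queue.shift());
    }
  });
  await Promise.all(lanes);
}

// Usage sketch (appendFile is hypothetical):
// await eachLimit(filePaths, 1, (file) => appendFile(archive, file));
```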
Yeah, I really hate using control-flow wrappers when archiver can handle it internally. I would really like to see fs read streams follow the new streams2 design more closely and not read data before something requests it.
I've been doing some research, though, and a reread of the docs. I'm actually thinking the root of the issue may be that streams are currently being paused/resumed by Archiver (need to confirm we still need such logic), which kicks the stream into old mode. I'm going to test without pause/resume in the next release.
v0.4.2 will bring some improvements: it avoids using data events and only uses pause/resume when _readableState isn't defined. That object stores the buffer in streams2, so if it doesn't exist we need to attempt a pause. This should at least help avoid the timing issues where possible.
Closing, as this appears to be resolved through recent changes.