
Comments (32)

ctalkington commented on May 26, 2024

work is underway on this. will need testers within a few weeks.

from node-archiver.

ctalkington commented on May 26, 2024

so i implemented new header tests a few weeks back. i've been caught up in work since, but i'll be rebasing the changes into the parse branch next chance i get and hopefully roll out that feature by early July, with appending to existing archives coming in a follow-up release.

kirbysayshi commented on May 26, 2024

Is there any plan to make this feature happen? Not sure if this ticket is still relevant, since there is no parse branch.

ctalkington commented on May 26, 2024

i had removed the branch as it got out of date with the overhaul to the generator side of things. im not sure when it will be revisited, but there are a few libs available depending on what you need to parse:

https://github.com/bower/decompress-zip
https://github.com/mafintosh/tar-stream

kirbysayshi commented on May 26, 2024

I ended up going with https://github.com/EvanOxfeld/node-unzip since it has stream support. But then I ran into a weird issue: EvanOxfeld/node-unzip#47 (comment).

It's only for an integration test, so it's ok, but I would love read support in node-archiver!

silverwind commented on May 26, 2024

+1 for unzipping support, if it's in scope for this module, of course.

ctalkington commented on May 26, 2024

thanks for voicing the features you would like to see. ill def keep it in mind.

nukulb commented on May 26, 2024

+1 for unzipping support and adding/deleting entries from an existing zip

ctalkington commented on May 26, 2024

if i did ever add it, id prob just wrap the below module:

https://github.com/bower/decompress-zip

andrewrk commented on May 26, 2024

bower/decompress-zip#27 :(

ctalkington commented on May 26, 2024

ah, hadn't explored it that deep. i tend to use a lazystream wrapper to work around EMFILE personally.


nukulb commented on May 26, 2024

I am using adm-zip for the unzipping, and it works well so far, but it hasn't had a lot of testing yet.

Right now I have to download a zip, unzip it, delete and add files, zip it again, and upload.

Ideally I was hoping adm-zip would just do it all, as the functionality is supposed to, but it just didn't work out of the box for adding files to an existing zip.

ctalkington commented on May 26, 2024

i see. at best, if i include such a feature, itll be a wrapper around an existing module. id rather contribute to other modules than have to maintain a whole new set of code.

andrewrk commented on May 26, 2024

I looked at adm-zip today too and found some things that made me uncomfortable.

My conclusion is that node.js does not have an unzipping module that meets a minimum level of quality.

ctalkington commented on May 26, 2024

def sad to see. zip generation was kinda the same when i made archiver. the original zipstream was around but unmaintained by its dev, and there were a few general js ones, but nothing heavily optimized for node and streams.

andrewrk commented on May 26, 2024

Thanks for that, by the way. I was really happy to see archiver when I revisited the problem about a year ago.

ctalkington commented on May 26, 2024

its been a good project. ive learnt a lot and have made things more modular in the past few months. zip-stream now exists for those who want to go at things on their own. im using tar-stream instead of maintaining a custom tar implementation (it also has pax header support). just recently, directory and bulk support was added.

listing and appending is def on the list of things id like to implement. just dont want to rush an implementation, as its going to require some API changes, and if i do it, i would want all default modules to support the same logic (ie zip, tar, tgz). there are also considerations like how to handle multiple files of the same name.

ctalkington commented on May 26, 2024

ok, so a little feedback would help on naming of the possible methods and a few other aspects.

list - to list contents given a stream, buffer, or filepath of an archive type.

looking at how decompress-zip only supports a file, we can either help them support other inputs or write a temp file from a stream or buffer.

import - to create a new instance given a stream, buffer, or filepath of an archive type.

looking at how decompress-zip only supports extracting to file, we might again need to work with them, as they have done a lot of the legwork on parsing the multiple variations of zip data.

a) i could see a decent queue building up when importing a huge file. may not be an issue with streaming though.
b) will need to store file data keyed by name so as to avoid duplicates.
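Point b) might look like the following minimal sketch, where imported entries are stored in a Map keyed by name so a later entry with the same path replaces the earlier one (all names here are hypothetical, not archiver API):

```javascript
// Hypothetical sketch of point b): store imported entries in a Map keyed
// by name, so a later entry with the same path replaces the earlier one
// instead of producing duplicates.
class ArchiveIndex {
  constructor() {
    this.entries = new Map();
  }
  add(entry) {
    this.entries.set(entry.name, entry); // same name: last one wins
  }
  list() {
    return [...this.entries.keys()];
  }
}
```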


ctalkington commented on May 26, 2024

the other option is to reuse some of their work, since its MIT licensed. i do see supporting stream input being a bit of a nightmare, as you will need to do a lot of reading ahead.

EDIT: if i did this, itd be to make a more universal zip parser that doesnt care much about the source.
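For a sense of what a source-agnostic zip parser has to do, here is a minimal sketch that lists entries by walking the central directory of an archive held in a Buffer. It assumes no trailing archive comment and no ZIP64 support, so it is illustrative only, not a robust parser.

```javascript
// Minimal, illustrative zip lister over a Buffer: find the End of Central
// Directory record at the tail, then walk the central directory headers.
// Assumes no archive comment and no ZIP64.
function listZip(buf) {
  const eocd = buf.length - 22; // fixed-size EOCD record when no comment
  if (buf.readUInt32LE(eocd) !== 0x06054b50) throw new Error('not a zip');
  const count = buf.readUInt16LE(eocd + 10); // total entry count
  let off = buf.readUInt32LE(eocd + 16);     // central directory offset
  const entries = [];
  for (let i = 0; i < count; i++) {
    if (buf.readUInt32LE(off) !== 0x02014b50) throw new Error('bad header');
    const nameLen = buf.readUInt16LE(off + 28);
    const extraLen = buf.readUInt16LE(off + 30);
    const commentLen = buf.readUInt16LE(off + 32);
    entries.push({
      name: buf.toString('utf8', off + 46, off + 46 + nameLen),
      size: buf.readUInt32LE(off + 24), // uncompressed size
    });
    off += 46 + nameLen + extraLen + commentLen;
  }
  return entries;
}
```

Because it only needs random access to a Buffer, the same logic works whether the bytes came from a file, a download, or memory, which is the "doesn't care much about source" idea.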


andrewrk commented on May 26, 2024

the .zip file would reside on disk, but reading an entry would be a stream, yes?

ctalkington commented on May 26, 2024

with decompress-zip, they built it around always having file input and fs outputs. this is why im not sure we could use their work without heavily contributing changes back.

so while doing stream/buffer input to temp files would work for parsing, we'd have to build our own logic to read the file content back into archiver. thus im leaning towards taking some of their work and generalizing it into a parser module that only uses buffers, so consuming modules can coerce inputs as they desire.

EDIT: essentially, id want a parser that returns me a contents() dynamic getter which reads on demand from the source, alongside the actual parsed file props.
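The record shape described in that EDIT, parsed header props plus a contents() getter that reads on demand, might look like this (hypothetical; buffer-backed purely for illustration):

```javascript
// Hypothetical record shape: header props are parsed up front, while the
// entry body is only read from the source when contents() is called.
function makeRecord(props, source, offset, size) {
  return {
    ...props, // name, mode, date, etc. from the parsed headers
    contents() {
      // On-demand read; a real parser might inflate or stream here.
      return source.slice(offset, offset + size);
    },
  };
}
```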


ctalkington commented on May 26, 2024

a good example of the end result id like to see passed back is:

https://github.com/wearefractal/vinyl

i believe id make a vinyl adapter that represents a zip file record.

silverwind commented on May 26, 2024

For an approach to reading a file list vs. the whole contents, I'd probably opt for something like a read method that reads just as many bytes as needed to present an array of filenames, similar to the output of fs.readdir, but probably with the path included. Parsing should probably stop at this point, until an extract method gets called, which then returns some form of in-memory file representation, maybe vinyl, but I'd prefer something that can be easily handed over to fs.

Alternatively, one could go the way of events, and emit an event when the TOC of the archive has been parsed, then provide a method to stop any further parsing at that point, until the application decides what exactly to extract.


ctalkington commented on May 26, 2024

just wanted to let everyone know this is going to be on my radar next month. currently booked up with work, and ill be dealing with moving in the next couple of weeks.

andrewrk commented on May 26, 2024

My use case for this feature would be covered if I could do the following:

  • Pipe a download stream into an unzipper
  • Extract all files to a temp directory

That's it. So ideally I would not have to save the .zip file; it could be a stream, and I would end up with all the files in a temp directory somewhere, which I could then do what I wish with.

silverwind commented on May 26, 2024

Yep, my case is exactly the same. I'm fine with the end result as files on disk, and I think having some sort of in-memory file format would probably be overkill. I'd just like the option to only extract what I need from the zip (selecting files by name, size, or date).

ctalkington commented on May 26, 2024

thanks for the feedback. i do think parsing will be the first part i look into. most likely it will be two parts: zip-stream and archiver (as invoker), since tar-stream already has a way to extract, which uses events. any thoughts on that style?


andrewrk commented on May 26, 2024

"events. any thoughts on that style?"

Sounds fine.

silverwind commented on May 26, 2024

I came to like events more than callbacks too, so that's fine with me as well.

I'd envision an event once the metadata for the files in the archive is available, with an option to pause any further parsing while deciding which files to extract.

ctalkington commented on May 26, 2024

once compress-commons gets off the ground, this should be a possibility. still working on ideas, using the java commons compress library as inspiration.

andrewrk commented on May 26, 2024

Robust listing and unzipping support is available in this library: yauzl

See also: yazl

ctalkington commented on May 26, 2024

closing this, as if i were to invest any time on such, id contribute to yazl
