
JavaScript module to handle huge file uploads by chunking them in the browser. Resumable, fault tolerant, offline aware, mobile ready.

License: BSD 3-Clause "New" or "Revised" License



huge-uploader

huge-uploader is a JavaScript module designed to handle huge file uploads by chunking them in the browser. Uploads are resumable, fault tolerant, offline aware and mobile ready.

HTTP, and especially HTTP servers, have limits and were not designed to transfer large files. In addition, network connections can be unreliable. No one wants an upload to fail after hours… Sometimes we even need to pause the upload, and HTTP doesn't allow that.

The best way to circumvent these issues is to chunk the file and send it in small pieces. If a chunk fails, no worries, it's small and fast to re-send it. Wanna pause? Ok, just start where you left off when ready.

That's what huge-uploader does. It:

  • splits the file into chunks of your chosen size,
  • retries uploading a chunk when its transfer fails,
  • automatically pauses the transfer when the device goes offline and resumes it when back online,
  • allows you to pause and resume the upload,
  • obviously allows you to set custom headers and POST parameters.

Installation & usage

npm install huge-uploader --save
import HugeUploader from 'huge-uploader';

// instantiate the module with a settings object
const uploader = new HugeUploader({ endpoint: '//where-to-send-files.com/upload/', file: fileObject });

// subscribe to events
uploader.on('error', (err) => {
    console.error('Something bad happened', err.detail);
});

uploader.on('progress', (progress) => {
    console.log(`The upload is at ${progress.detail}%`);
});

uploader.on('finish', (body) => {
    console.log('yeahhh - last response body:', body.detail);
});

// if you want to pause/resume the upload
uploader.togglePause();

Constructor settings object

The constructor takes a settings object. Available options are:

  • endpoint { String } – where to send the chunks (required)
  • file { Object } – a File object representing the file to upload (required)
  • headers { Object } – custom headers to send with each request
  • postParams { Object } – post parameters that will be sent with the last chunk
  • chunkSize { Number } – size of each chunk in MB (default is 10MB)
  • retries { Number } – number of total retries (total, not per chunk) after which upload fails (default is 5)
  • delayBeforeRetry { Number } – how long to wait (in seconds) after a failure before next try (default is 5s)
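The chunkSize setting determines how many requests a given file produces. A hypothetical helper (not part of the library) that mirrors this calculation:

```javascript
// Hypothetical helper: number of chunks the uploader will produce for a
// given file size (bytes) and chunkSize setting (MB).
function chunkCount(fileSizeBytes, chunkSizeMB) {
    return Math.ceil(fileSizeBytes / (chunkSizeMB * 1024 * 1024));
}

// A 25 MB file with the default 10 MB chunkSize yields 3 chunks.
console.log(chunkCount(25 * 1024 * 1024, 10)); // 3
```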

Events

Event payloads are instances of the CustomEvent constructor, so the message is available in the detail property.

error

Fired either when the server responds with an error code that a retry won't fix, or when there have been too many retries already. Success response codes are 200, 201 and 204. All error codes apart from 408, 502, 503 and 504 are considered permanent and won't be retried.

uploader.on('error', err => console.log(err.detail)); // A string explaining the error

fileRetry

uploader.on('fileRetry', (msg) => {
    /** msg.detail is an object like:
     * {
     *     message: 'An error occurred uploading chunk 243. 6 retries left',
     *     chunk: 243,
     *     retriesLeft: 6
     * }
     */
});

progress

uploader.on('progress', progress => console.log(progress.detail)); // Number between 0 and 100

finish

The finish event is triggered with the last response body attached.

uploader.on('finish', body => console.log('🍾', body.detail));

offline

Notifies that the browser is offline, so the uploader paused itself. This pause is internal: it is unrelated to the pause triggered by the .togglePause() method and does not affect the user pause state.

uploader.on('offline', () => console.log('no problem, wait and see…'));

online

Notifies that browser is back online and uploader is going to resume the upload (if not paused by .togglePause()).

uploader.on('online', () => console.log('😎'));

Method

There is only one method: .togglePause(). As its name implies, it pauses and resumes the upload. If you need to abort an upload, use this method to stop it, then drop your reference to the instance.

How to set up with the server

This module has a twin Node.js module to handle uploads with a Node.js server as a backend. Nevertheless, it's easy to implement the server side in your preferred language (if you develop such a module, tell me about it so I can add it to this README).

Files are sent with POST requests containing the following headers:

  • uploader-file-id – a unique file id based on the file size, upload time and a randomly generated number (so it's really unique),
  • uploader-chunks-total – the total number of chunks that will be sent,
  • uploader-chunk-number – the current chunk number (0-based index, so the last chunk is uploader-chunks-total - 1).

POST parameters are sent with the last chunk if any (as set in constructor's options object).

The typical server implementation is to create a directory (name it after uploader-file-id) when chunk 0 is received and write all chunks into it. When last chunk is received, grab the POST parameters if any, concatenate all the files into a single file and remove the temporary directory.

Also, bear in mind that you might never receive the last chunk if an upload is abandoned, so remember to clean your upload directory from time to time.

If you are uploading to a domain or subdomain other than the current site, you'll have to set up CORS accordingly. That is, set the following CORS headers:

  • Access-Control-Allow-Origin: https://origin-domain.com (here you can set a wildcard or the domain from which you upload the file),
  • Access-Control-Allow-Methods: POST,OPTIONS,
  • Access-Control-Allow-Headers: uploader-chunk-number,uploader-chunks-total,uploader-file-id,
  • Access-Control-Max-Age: 86400.

These headers tell your browser that it can use the OPTIONS (preflight) and POST methods on the target domain and that the custom headers are allowed to be sent. The last header tells the browser that it can cache the result of the preflight request (here for 24 hrs) so that it doesn't need to re-send a preflight before each POST request.
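The header list above can be expressed as a plain object, ready to merge into a Node.js response (e.g. via res.writeHead). The origin value shown is the placeholder from the list; replace it with your actual upload origin.

```javascript
// Build the CORS headers needed for chunked uploads from another origin.
function corsHeaders(origin) {
    return {
        'Access-Control-Allow-Origin': origin,
        'Access-Control-Allow-Methods': 'POST,OPTIONS',
        'Access-Control-Allow-Headers': 'uploader-chunk-number,uploader-chunks-total,uploader-file-id',
        'Access-Control-Max-Age': '86400',
    };
}

console.log(corsHeaders('https://origin-domain.com'));
```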

Under the hood

The library works around the HTML5 File API, the rather new Fetch API and the new EventTarget constructor.

The EventTarget constructor is polyfilled, so it won't be a problem. Nevertheless, your target browsers have to support the HTML5 File API and the Fetch API; all recent browser versions apart from Internet Explorer do.

You can polyfill Fetch if you want to support IE.

Contributing

There's sure room for improvement, so feel free to hack around and submit PRs! Please just follow the style of the existing code, which is Airbnb's style with minor modifications.

To maintain things clear and visual, please follow the git commit template.


huge-uploader's Issues

Headers included in byte array at server

My server-side implementation receives the chunks of a ZIP as byte arrays:

@POST
@Path("/uploadThing")
@Consumes(MediaType.MULTIPART_FORM_DATA)
@Produces(MediaType.APPLICATION_JSON)
public ThingResponse uploadThing(final @Context() ContainerRequestContext p_Context, byte[] p_RawData) {
    return new ThingResponse(p_Context, p_RawData);
}

I concatenate all the chunks to a single byte array and save it against a business object in an Oracle table.

When I download the ZIP using this code and MediaType.APPLICATION_OCTET_STREAM:

BufferedOutputStream bufferedOutputStream = new BufferedOutputStream(p_OutputStream);
bufferedOutputStream.write(thingBO.getRawData());

then the resulting ZIP cannot be opened. Looking at it in Notepad++, the new ZIP is almost exactly the same as the original, except that at the top it has this (before the first "PK" of the ZIP bytes):

------WebKitFormBoundaryXSHBfDH8JAH20815
Content-Disposition: form-data; name="file"; filename="blob"
Content-Type: application/octet-stream

If I remove these 3 lines and save then I can open the ZIP fine and it exactly matches the original.

How can I stop it from adding these 3 lines?

They were not added when I uploaded the ZIP file directly, without your code, but some ZIP files are too big so I do need it. (I was doing a JSON post with the byte[] as one property of an object parameter.)

Many thanks.

Unable to process blobs when using Apple Quicktime screen recorded videos

Hi,

On an apple device

  1. open the QuickTime Player app
  2. then click File -> Screen recording
  3. record your screen for a few seconds, then export with default settings

You should now have a .avi or a .mov file.

When choosing this file for upload, the HugeUploader fails to process the first chunk and immediately enters a retry phase. This repeats until retries are exhausted.

I believe the problem is with the filename not the actual file itself, as I was able to upload it after renaming it to something else.

Please fix.

How do I use this?

(Yes, I'm stupid)
This looks like a npm package but I don't know how to use them in browsers. How do I use this? Also, I would appreciate if you add more details to README so then there would be no more people like me.

Problem on IE11

Even when polyfilling Fetch and Promise, we can't get this to work on IE11.

We get the same issue as reported here, due to the dependency on event-target-shim:

mysticatea/event-target-shim#30

Maybe the dependency (3.0.1) is now too old to be supported?

Edited: we have fixed this by babel polyfilling node_modules/event-target-shim itself
