
raw-body's Issues

2.5.0 breaks koa bodyparser on startup

Hello.

The 2.5.0 release breaks the co-body package, which koa-bodyparser depends on. The error is:

TypeError: Cannot read property 'split' of undefined
    at toIdentifier (/usr/src/app/node_modules/toidentifier/index.js:24:6)

The stack trace points to raw-body as the ultimate source. I added a resolution in package.json to force raw-body to 2.4.3, and the issue was resolved.
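For reference, a minimal pin using the Yarn resolutions field (assuming Yarn; recent npm versions have an equivalent overrides field) might look like:

{
  "resolutions": {
    "raw-body": "2.4.3"
  }
}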

When the request connection is closed prematurely, the module never yields

I have been testing the case where a request terminates before finishing properly, using the following code to simulate such a request:

var http = require('http');
var options = {
  host: 'localhost',
  path: '/'
};
var req = http.request(options, function (res) {
  res.on('data', function () {});
  res.on('end', function () {});
});

// Write part of a body but deliberately never call req.end(),
// then kill the process to simulate an abrupt disconnect.
req.write('123');

setTimeout(function () {
  process.exit();
}, 200);

In this case the module never yields, because only the 'close' event is emitted.

"Error: stream encoding should not be set"

This error is thrown from line 48 of index.js (raw-body version 1.1.1):

Error: stream encoding should not be set
    at makeError (/my/app/node_modules/express/node_modules/connect/node_modules/raw-body/index.js:132:15)
    at /my/app/node_modules/express/node_modules/connect/node_modules/raw-body/index.js:48:17
    at process._tickCallback (node.js:415:13)

The value of stream.encoding is undefined rather than null, which is what the check expects. I updated the line to read as follows, and everything seems to work great:

if (state && state.encoding !== null && state.encoding != undefined)

check limit before decoding bytes

In this function:

  function onData (chunk) {
    if (complete) return

    received += chunk.length

    if (decoder) {
      buffer += decoder.write(chunk)
    } else {
      buffer.push(chunk)
    }

    if (limit !== null && received > limit) {
      done(createError(413, 'request entity too large', {
        limit: limit,
        received: received,
        type: 'entity.too.large'
      }))
    }
  }

We already know whether we are going to return an error at the point where received is updated; the error check would be better placed before the write/push calls, since the written/pushed data is just thrown away when too much has been read. A reordered sketch follows.
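A minimal reordering sketch, using the same names as the snippet above (an illustration, not the library's actual code):

  function onData (chunk) {
    if (complete) return

    received += chunk.length

    // Check the limit first so we never decode or buffer
    // bytes that are about to be discarded.
    if (limit !== null && received > limit) {
      done(createError(413, 'request entity too large', {
        limit: limit,
        received: received,
        type: 'entity.too.large'
      }))
      return
    }

    if (decoder) {
      buffer += decoder.write(chunk)
    } else {
      buffer.push(chunk)
    }
  }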

raw-body will hang when the param is a parsed body

Hi, thanks for your work on this repo.
I use raw-body together with koa-joi-router, which depends on raw-body, in the same project.
I found that a body that has already been parsed never triggers any of the listeners in raw-body's readStream function, even though stream.readable is false.
It might be better to throw an error or just return the parsed body in this situation. A caller-side guard is sketched below.
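A minimal caller-side guard, assuming the standard readable flag on the stream (the wrapper name is illustrative):

var getRawBody = require('raw-body')

function readBody (stream, opts, callback) {
  // Fail fast on a stream that has already been consumed instead of hanging.
  if (!stream.readable) {
    return callback(new Error('stream is not readable; body may already be parsed'))
  }
  getRawBody(stream, opts, callback)
}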

Clarification of documentation about handling errors

The readme states:

If an error occurs, the stream will be paused, everything unpiped, and you are responsible for correctly disposing the stream. For HTTP requests, no handling is required if you send a response. For streams that use file descriptors, you should stream.destroy() or stream.close() to prevent leaks.

I would take issue with the sentence "For HTTP requests, no handling is required if you send a response." In this example, the loop stops after the first iteration:

let rawbody = require("raw-body");
let axios = require("axios");
let express = require("express");
let app = express();

app.put("/abc", function(req, res, next)
{
	rawbody(req, { limit: 1024 }, function(err)
	{
		if(err) return next(err);
		res.send("abc");
	});
});

app.use(function(err, req, res, next)
{
	res.status(413).send("too large");
});

(async function()
{
	axios.defaults.httpAgent = new (require("http").Agent)({ keepAlive: true });
	await new Promise(resolve => app.listen(8888, resolve));
	for(let i = 0; i < 10; i++)
	{
		console.log(i);
		let res = await axios.put("http://127.0.0.1:8888/abc", Buffer.alloc(1000000), { validateStatus: null });
		console.log(res.data);
	}
}());

Output:

0
too large
1

Note that Express's default finalhandler reads the stream to the end before sending the response, and indeed with the default handler the loop does not stop after the first iteration. So that is probably the correct way to handle an error from raw-body: when the connection is keep-alive, it is not enough to just send a response. Also, I would be interested to know where this behavior of the Node.js http module is documented. A sketch of an error handler that accounts for the leftover body follows.
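A minimal sketch of an error handler that deals with the unread body, so a keep-alive connection does not stall (draining via req.resume() mirrors what finalhandler does; opting out of reuse with a Connection: close header is an alternative, both shown here as assumptions):

app.use(function(err, req, res, next)
{
	// Drain whatever remains of the request body so the
	// keep-alive connection can be reused...
	req.resume();
	// ...or give up on reuse and close the connection instead:
	// res.set("Connection", "close");
	res.status(413).send("too large");
});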

Incorrect default encoding

I'm creating this issue again since you closed #50 and then locked it, not letting me reply anymore. Sigh...

You said:

The encoding is passed to the iconv-lite module, not node.js core.

First, if you check their wiki at Supported-Encodings:

Node.js Native encodings: utf8, ucs2 / utf16le, ascii, binary, base64, hex

It's utf8, not utf-8, basically because the encodings ARE passed to node.js after all.

Second, you can take a look at their tests to see that utf8 is used everywhere.

Third, if it works with utf-8, it's just because it's an ALIAS. All encodings are transformed with:

var enc = (''+encoding).toLowerCase().replace(/[^0-9a-z]|:\d{4}$/g, "");

which basically converts utf-8 into the CORRECT utf8.

So the default value of utf-8 in this library is MISLEADING because the correct ones, both in node and iconv-lite, are ------> utf8.

Since you locked the other issue, it seems that you are mad at the world, or my contributions/messages are disturbing to you. So this is my last message; do whatever you want with your repo. Thanks. Happy new year.

Can't resolve 'async_hooks' in 'raw-body'

Even though I run the code on the server side, I wonder why there is a warning that async_hooks can't be resolved.

(screenshot: the webpack "Can't resolve 'async_hooks'" warning, 2022-11-01)

For now, I worked around it as shown below.

// e.g. in a webpack-based config (IgnorePlugin comes from the webpack package)
webpack: (config) => {
  config.plugins.push(
    new IgnorePlugin({
      resourceRegExp: /async_hooks/,
    })
  );

  return config;
},

close versus end event

I have an issue with the way raw-body handles the 'close' event.

Currently, if I:

1. receive 'data'
2. receive 'close'
3. receive 'end'

then the buffer is empty because 'close' triggers a cleanup.

According to the node.js documentation, the 'close' event is "Emitted when the underlying resource (for example, the backing file descriptor) has been closed. Not all streams will emit this."

To me, this means that the "underlying" resource has been closed (a file descriptor, for example). It does not mean that 'end' has already been emitted when there is still data in the internal stream buffer.

Can we remove the handling of 'close' altogether?

Should `request size did not match content length` be considered an error in the API

Hi Doug, thank you for all the awesome work with keeping these http tool-chains going. We really appreciate your efforts.

I'd like to know whether the error condition for request size did not match content length should always be considered fatal enough to end the request. See:

done(createError(400, 'request size did not match content length', {

We've hit a situation in a system where the request size is expected not to match the content length, but raw-body may still be called on it. If I comment out the error condition in raw-body, we get past the error and our request works; all tests go back to passing.

support encodings?

It's most performant to do string decoding as you read off the stream, instead of all in one go after buffering. It may be worth accepting encodings here via iconv-lite instead of only the core encodings (yes, I know iconv-lite can patch core to augment encodings, but at least it's not a native module). A streaming-decode sketch follows.
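A minimal sketch of incremental decoding with iconv-lite's streaming decoder, assuming a readable stream named stream (the surrounding wiring is illustrative):

var iconv = require('iconv-lite')

var decoder = iconv.getDecoder('utf8')
var body = ''

stream.on('data', function (chunk) {
  // Decode each chunk as it arrives instead of buffering everything first.
  body += decoder.write(chunk)
})

stream.on('end', function () {
  // Flush any partial multi-byte character the decoder held back.
  body += decoder.end()
  console.log(body)
})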

unable to retrieve https registry from our network

Greetings,

After connect started to depend on the raw-body package, our build started failing because we are unable to retrieve the https repository.

Is it possible to change the repository URL of raw-body to git://github.com/ instead of https://?

Cheers,

Roderick

customize how errors are done?

I want to depend on raw-body but handle errors slightly differently.

Currently I handle errors like ( https://github.com/Raynos/body/blob/master/index.js#L5 )

The difference is that I have a statusCode property instead of status and each one of my errors has a type field.

One option is for me to just create new errors or wrap the errors raw-body passes up. Not sure how else to deal with differences.

Need to add a timeout option

It's all well and good to wait indefinitely for a stream to finish reading, but typically when dealing with HTTP (as this module does), you need some kind of timeout. Right now you can, in fact, sort of do this by doing a bunch of work to wrap a timer and manually emit an 'error' event on req, but that is lame.

I think there should be an optional timeout (defaulting to Infinity, like now). What I'm not sure of yet is how the timeout should specifically work: is it a total-time timer or an inactivity timer? Both have pros and cons. The current workaround is sketched below.
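A minimal sketch of the wrap-a-timer workaround described above (total-time variant; the wrapper name is illustrative):

var getRawBody = require('raw-body')

function getRawBodyWithTimeout (req, opts, ms, callback) {
  var timer = setTimeout(function () {
    // raw-body listens for 'error' on the stream, so this makes it bail out.
    req.emit('error', new Error('body read timed out'))
  }, ms)

  getRawBody(req, opts, function (err, body) {
    clearTimeout(timer)
    callback(err, body)
  })
}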

Promise implementation as an option

We would like to be able to pass our own Promise implementation as an option rather than setting it as a global.

Reasoning:

  • Libraries using this library are forced to implement the same strategy and disclaimer for their users, as we shouldn't touch a user's global. But we may want to support only Promises, since we can provide our own implementation (yet there's no way to do that without being able to pass it in as an option).
  • There exists a wide variety of Promise libraries and implementations. The one on the global may not be the implementation a user or library writer wants to use; touching the global may not be something a user wants to do, and is definitely not what a library writer using this lib wants to do. (A wrapper-based workaround is sketched below.)
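In the meantime, a wrapper over the callback API works with any Promise implementation (Bluebird here is just an example; the wrapper name is illustrative):

var getRawBody = require('raw-body')
var MyPromise = require('bluebird')

function getRawBodyWith (stream, opts) {
  // Wrap the callback API so no global Promise is touched.
  return new MyPromise(function (resolve, reject) {
    getRawBody(stream, opts, function (err, body) {
      if (err) return reject(err)
      resolve(body)
    })
  })
}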

check `stream.headers` for options

The primary reason I didn't do this before was content-encoding, but if we handle content-encoding we could just grab all the necessary info from stream.headers, if available.

I'm sure there'll be edge cases if people do crazy shit...

Parse error

I want to test the length option, but I get a Parse Error instead of the expected 400 status code. The lib can't capture the error and just throws it out.

rawBody(request, { limit: '1mb', encoding: 'utf8', length: 191 })

// the request:

request(app.listen())
    .post('/')
    .set('Content-Type', 'text/xml')
    .set('Content-Length', 191) // here is the length 
    .send(sourceXml) // the data's actual length is 197
    .expect(200, done);

and I get the error:

Error: Parse Error
      at Error (native)
      at Socket.socketOnData (_http_server.js:343:22)
      at Socket.emit (events.js:107:17)
      at readableAddChunk (_stream_readable.js:163:16)
      at Socket.Readable.push (_stream_readable.js:126:10)
      at TCP.onread (net.js:538:20)

AsyncResource never emits `destroy` hook

To provide some context: I'm using Jest to write tests, and I get a bunch of warnings like the one below:

Jest has detected the following 3 open handles potentially keeping Jest from exiting:

  ●  bound-anonymous-fn

      79 |
      80 |       await request(app.getHttpServer())
    > 81 |         .post('/client-mapping')
         |          ^
      82 |         .set('Content-Type', 'application/json')
      83 |         .send(requestBody)
      84 |         .expect(200)

Jest has a feature to detect open handles, and it uses async_hooks for that. Essentially, it registers the asyncId when it receives the init hook and unregisters it when it gets the destroy hook. If anything is still registered after the tests have run, then some async resource wasn't properly destroyed. The full implementation is here.

I've nailed the warning down to this library. Apparently, it creates the AsyncResource here, but it never calls .emitDestroy(), and therefore the destroy hook never fires. According to the Node documentation, the hook must be called manually, as sketched below.
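For reference, a minimal sketch of the lifecycle the Node documentation requires (illustrative, not the library's actual code):

const { AsyncResource } = require('async_hooks')

const resource = new AsyncResource('raw-body')

resource.runInAsyncScope(() => {
  // ... do the work associated with this resource ...
})

// Must be called manually, or the destroy hook never fires
// and tools like Jest report an open handle.
resource.emitDestroy()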

TypeError: stream.on is not a function

Hello,
I get this error while using Koa middleware.

TypeError: stream.on is not a function
    at readStream (/node_modules/raw-body/index.js:197:10)
    at executor (/node_modules/raw-body/index.js:113:5)
    at new Promise (<anonymous>)
    at getRawBody (/node_modules/raw-body/index.js:112:10)
    at /src/index.ts:99:30
    at step (/src/index.ts:33:23)
    at Object.next (/src/index.ts:14:53)
    at /src/index.ts:8:71
    at new Promise (<anonymous>)
    at __awaiter (/src/index.ts:4:12)
    at /src/index.ts:95:29
    at dispatch (/node_modules/koa-compose/index.js:42:32)
    at filter (/node_modules/koa-json/index.js:25:12)
    at dispatch (/node_modules/koa-compose/index.js:42:32)
    at cors (/node_modules/@koa/cors/index.js:59:38)
    at dispatch (/node_modules/koa-compose/index.js:42:32)

My code is:

var contentType = require('content-type')
var getRawBody = require('raw-body')

app.use(async (ctx, next) => {
  if (ctx.request.originalUrl.startsWith('/raw-parse')) {
    // Note: ctx.request is Koa's request wrapper, not the underlying
    // Node stream (that is ctx.req), which is why stream.on is missing.
    ctx.text = await getRawBody(ctx.request, {
      length: ctx.request.headers['content-length'],
      limit: '1mb',
      encoding: contentType.parse(ctx.request).parameters.charset
    })
  }

  await next();
})

Error: Cannot switch to old mode now.

I'm using https://github.com/balderdashy/skipper to upload files in a Sails.js application, which gave the error Error: Cannot switch to old mode now. After some digging, I found that the error goes away if I comment out line #33 in stream-utils/raw-body/index.js. It seems to be related to this issue: https://github.com/balderdashy/skipper/issues/10.

While I'm convinced this is an issue related to balderdashy/skipper specifically, I'm posting this here anyway - maybe you guys can shed some light on this issue.

Cheers, Fabian

expecting a buffer stream

You throw an "expected a buffer stream" error if an encoding was set, but only after you have consumed the stream.

Would it be helpful to throw this error at the start as well, i.e. check the encoding before consuming the stream?

Content-Type: text/xml cannot be parsed

Demo code like this:

app.use(function* (next) {
  var string = yield getRawBody(this.req, {
    length: this.length,
    limit: '1mb',
    encoding: this.charset
  })
  this.body = string;
});

When I make a request using curl:

curl -d 'xxxxxxxxxxxxxxx' 'http://localhost:80/' -H 'Content-Type: application/xml'

It's OK. But when I change the Content-Type to text/xml, it cannot be parsed.

curl -d 'xxxxxxxxxxxxxxx' 'http://localhost:80/' -H 'Content-Type: text/xml'

dumping stream for 413

You dump the stream at the start if content-length > limit but don't dump the stream if the limit is hit in the data handler.

In both cases you return a 413.

Why do you dump the stream? Shouldn't the application-level error handler call req.close()?

If the stream has already emitted its end event, raw-body never yields

getRawBody(res, function (err, string) {
    getRawBody(res, function (err, string) {
        console.log('never called')
    })
})

Or, more realistically, you're using app.use(bodyParser) and a library like json-api which both try to run getRawBody.

Not sure whether it's raw-body's job to check that the stream hasn't already ended, or the user's job to make sure they're not passing getRawBody an expired stream. A caller-side check is sketched below.
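A minimal caller-side check, assuming a Node version that exposes readableEnded (the wrapper name is illustrative):

var getRawBody = require('raw-body')

function safeRawBody (stream, opts, callback) {
  // Fail fast instead of waiting forever on a stream that already ended.
  if (stream.readableEnded) {
    return callback(new Error('stream has already ended'))
  }
  getRawBody(stream, opts, callback)
}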

Flowing streams

So the question arises from #23 of whether this module should accept a "flowing" stream or not. Node.js core defines a flowing stream as one that has been piped. Right now this works as long as there are no errors; otherwise it explodes.

I'm leaning on saying we should reject flowing streams.

Example of passing a flowing stream:

req.pipe(my_stream) // makes it "flowing"
getRawBody(req, 'utf8', function (err, body) {
})

x-www-form-urlencoded request resulting in Request Aborted

Hi there,

I probably have something configured incorrectly, but I have researched and googled this issue for a couple of days now and haven't made any headway; any help would be hugely appreciated.

When I send an x-www-form-urlencoded request with a couple of parameters, the request seems to time out, causing the onAborted stream listener to fire (after a 120-second timeout).

My request is a "POST" that I can send with postman to reproduce.


I have added some logging inside node_modules/raw-body/index.js to try to track down what's actually happening inside the readStream function.

It appears that none of the listeners are ever triggering (none of these functions are called, except for onAborted):

  // attach listeners
  stream.on("aborted", onAborted);
  stream.on("close", cleanup);
  stream.on("data", onData);
  stream.on("readable", () => {
    // 'readable' is emitted with no arguments
    console.log("data can be read");
  });
  stream.on("end", onEnd);
  stream.on("error", onEnd);

After I send the request, only onAborted gets triggered, and only after 120 seconds (on our dev server, 60 seconds), so I believe this is the result of a timeout firing.

After logging the stream, I can see that the body is actually being parsed completely and correctly:

  files: null,
  read: [Function],
  body: 
   { bt_signature: 'bn3gnsr7qw534sgt%26cxqbt32bycvftqkp%7Cd3180e35f92dd5e9b772cf330f6076fa5d31d235',
     bt_payload: 'PD94bWwgdmVyc2lvbj0iMS4wIiBlbmNvZGluZz0iVVRGLTgiPz4KPG5vdGlm%0AaWNhdGlvbj4KICA8a2luZD5jaGVjazwva2luZD4KICA8dGltZXN0YW1wIHR5%0AcGU9ImRhdGV0aW1lIj4yMDE2LTEyLTE0VDE1OjUxOjM1WjwvdGltZXN0YW1w%0APgogIDxzdWJqZWN0PgogICAgPGNoZWNrIHR5cGU9ImJvb2xlYW4iPnRydWU8%0AL2NoZWNrPgogIDwvc3ViamVjdD4KPC9ub3RpZmljYXRpb24%2BCg%3D%3D%0A' },
  _body: true,
  length: undefined }

The readableState of the stream looks like this:

      ReadableState {
        objectMode: false,
        highWaterMark: 16384,
        buffer: [Object],
        length: 0,
        pipes: null,
        pipesCount: 0,
        flowing: true,
        ended: false,
        endEmitted: false,
        reading: true,
        sync: false,
        needReadable: true,
        emittedReadable: false,
        readableListening: false,
        resumeScheduled: false,
        defaultEncoding: 'utf8',
        ranOut: false,
        awaitDrain: 0,
        readingMore: false,
        decoder: null,
        encoding: null },

I've tried a few different versions of Node (v6.2.2 and v7.2.1), all with the same behavior.

I'm not quite sure what else to try, or what other information I should provide. Do you have any thoughts on why this is happening?

Thank you,
Doug
