stream-utils / raw-body
Get and validate the raw body of a readable stream
License: MIT License
While looking at the dependency graph of express, I saw two versions of the http-errors package:
https://npm.anvaka.com/#/view/2d/express
[email protected]
[email protected]
Any reason for not using tilde (~) or caret (^) in package.json? The latest version is 1.7.0.
Maybe you would like to consider using Greenkeeper to keep your dependencies up to date?
See issue Raynos/body#2.
If you want to consume the body as a string, then using StringDecoder is better than an array of buffers.
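A minimal sketch of that approach, using Node's built-in string_decoder module (the collectString function name is mine, purely illustrative):

var StringDecoder = require('string_decoder').StringDecoder

function collectString (stream, callback) {
  var decoder = new StringDecoder('utf8')
  var body = ''

  stream.on('data', function (chunk) {
    // decode incrementally instead of accumulating an array of buffers
    body += decoder.write(chunk)
  })

  stream.on('end', function () {
    body += decoder.end() // flush any trailing partial multi-byte character
    callback(null, body)
  })

  stream.on('error', callback)
}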
Hello.
The 2.5.0 release breaks the package co-body, which is depended on by koa-bodyparser. The error is:
TypeError: Cannot read property 'split' of undefined
  at toIdentifier (/usr/src/app/node_modules/toidentifier/index.js:24:6)
and the stack trace points to raw-body as the ultimate source. I added a resolution in package.json to force raw-body to 2.4.3, and the issue was resolved.
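For reference, this is the kind of entry I mean (Yarn resolutions syntax; the exact field depends on your package manager):

{
  "resolutions": {
    "raw-body": "2.4.3"
  }
}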
I have been testing the case where a request terminates before properly finishing, using the following code to simulate such a request:
var http = require('http');

var options = {
  host: 'localhost',
  path: '/'
};

var req = http.request(options, function (res) {
  res.on('data', function () {});
  res.on('end', function () {});
});

// write part of the body, but never call req.end()
req.write('123');

// kill the process before the request completes
setTimeout(function () {
  process.exit();
}, 200);
In this case the module never yields, because only the close event is triggered.
You check for stream._readableState. I still want to support Node 0.8.
This error is thrown from line 48 of index.js (raw-body version 1.1.1):
Error: stream encoding should not be set
  at makeError (/my/app/node_modules/express/node_modules/connect/node_modules/raw-body/index.js:132:15)
  at /my/app/node_modules/express/node_modules/connect/node_modules/raw-body/index.js:48:17
  at process._tickCallback (node.js:415:13)
The value of stream.encoding is undefined rather than null, which is what the check expects. I updated the line to read as follows, and everything seems to work great:
if (state && state.encoding !== null && state.encoding !== undefined)
In this function:
function onData (chunk) {
  if (complete) return

  received += chunk.length

  if (decoder) {
    buffer += decoder.write(chunk)
  } else {
    buffer.push(chunk)
  }

  if (limit !== null && received > limit) {
    done(createError(413, 'request entity too large', {
      limit: limit,
      received: received,
      type: 'entity.too.large'
    }))
  }
}
We know whether we are going to return an error at the point where we update received; it seems the error check would be better placed before the write/push calls, since the written/pushed data would just be thrown away anyway when too much data has been read.
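For illustration, the reordered version might look like this (same variables as the snippet above; a sketch, not the module's actual code):

function onData (chunk) {
  if (complete) return

  received += chunk.length

  // check the limit first so data that will be discarded is never decoded or buffered
  if (limit !== null && received > limit) {
    return done(createError(413, 'request entity too large', {
      limit: limit,
      received: received,
      type: 'entity.too.large'
    }))
  }

  if (decoder) {
    buffer += decoder.write(chunk)
  } else {
    buffer.push(chunk)
  }
}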
Hi, thanks for your work on this repo.
I use raw-body, together with koa-joi-router (which depends on raw-body), in the same project. I found that when the body has already been parsed, none of the listeners in raw-body's readStream function ever fire, even though stream.readable is false.
It may be better to throw an error, or to just return the already-parsed body, in this situation.
The readme states:
If an error occurs, the stream will be paused, everything unpiped, and you are responsible for correctly disposing the stream. For HTTP requests, no handling is required if you send a response. For streams that use file descriptors, you should stream.destroy() or stream.close() to prevent leaks.
I would take issue with the sentence "For HTTP requests, no handling is required if you send a response." In this example, the loop stops after the first iteration:
let rawbody = require("raw-body");
let axios = require("axios");
let express = require("express");

let app = express();

app.put("/abc", function (req, res, next) {
  rawbody(req, { limit: 1024 }, function (err) {
    if (err) return next(err);
    res.send("abc");
  });
});

app.use(function (err, req, res, next) {
  res.status(413).send("too large");
});

(async function () {
  axios.defaults.httpAgent = new (require("http").Agent)({ keepAlive: true });
  await new Promise(resolve => app.listen(8888, resolve));
  for (let i = 0; i < 10; i++) {
    console.log(i);
    let res = await axios.put("http://127.0.0.1:8888/abc", Buffer.alloc(1000000), { validateStatus: null });
    console.log(res.data);
  }
}());
Output:
0
too large
1
Note that Express's default finalhandler will read the stream to the end before sending a response, and indeed with the default handler it does not stop after the first iteration. So that is probably the correct way to handle an error from raw-body. It seems that when the connection is keep-alive, it is not enough to just send a response. I would also be interested to know where this behavior of the Node.js http module is documented.
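For what it's worth, a hedged sketch of finalhandler-style handling in the error handler above, based on my reading of the behavior (not something raw-body documents):

app.use(function (err, req, res, next) {
  // drain whatever is left of the request so the keep-alive socket is not
  // left with unread data; this mirrors what finalhandler appears to do
  req.resume()
  res.status(413).send("too large")
})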
I'm creating this issue again, since you closed #50 and then locked it, not letting me reply. Sigh...
You said:
The encoding is passed to the iconv-lite module, not node.js core.
First, if you check their wiki at Supported-Encodings:
Node.js Native encodings: utf8, ucs2 / utf16le, ascii, binary, base64, hex
It's utf8, not utf-8. Basically because the encodings ARE passed to Node.js after all.
Second, you can take a look at their tests to see that utf8 is used everywhere.
Third, if it works with utf-8, that's just because it's an ALIAS. All encodings are normalized with:
var enc = (''+encoding).toLowerCase().replace(/[^0-9a-z]|:\d{4}$/g, "");
which basically converts utf-8 into the CORRECT utf8.
So the default value of utf-8 in this library is MISLEADING, because the correct spelling, in both Node and iconv-lite, is utf8.
Since you locked the other issue, it seems that you are mad at the world, or that my contributions/messages are disturbing to you. So this is my last message; do whatever you want with your repo. Thanks. Happy new year.
We were unable to find how the aborted event is raised. According to https://nodejs.org/api/stream.html, no such event exists. I also couldn't find any other reference to it in the codebase.
Every time we POST data, raw-body just hangs and eventually times out with an aborted error, even when we're posting a very small amount of data.
return defer
function defer (fn) {
  done = fn
}
We need to check the limit on both the compressed body and the resulting uncompressed body.
I have an issue with the way raw-body handles the 'close' event.
Currently, if I:
1. receive 'data'
2. receive 'close'
3. receive 'end'
then the buffer is empty, because 'close' triggers a cleanup.
According to node.js documentation regarding the 'close' event, it is "Emitted when the underlying resource (for example, the backing file descriptor) has been closed. Not all streams will emit this."
To me, this means that the underlying resource (a file descriptor, for example) has been closed; it does not mean that 'end' has already been emitted, and there may still be data in the internal stream buffer.
Can we remove the handling of 'close' altogether?
Hi Doug, thank you for all the awesome work with keeping these http tool-chains going. We really appreciate your efforts.
I'd like to know whether the "request size did not match content length" error condition should always be considered fatal enough to end the request. See:
Line 263 in afcb732
We've hit a situation in a system where the request size is expected not to match the content length, but raw-body may still be called on it. If I comment out the error condition in raw-body, we get past the error, our request works, and all tests go back to passing.
It's most performant to do string decoding as you read off the stream, instead of all in one go after buffering. It may be worth accepting encodings here via iconv-lite instead of only the core encodings (yes, I know iconv-lite can alter core to augment the encodings, but at least it's not a native module).
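A sketch of what incremental decoding could look like with iconv-lite's stream API (collectDecoded is an illustrative name, not raw-body's implementation):

var iconv = require('iconv-lite')

function collectDecoded (stream, encoding, callback) {
  var body = ''
  var decoder = iconv.decodeStream(encoding) // decodes chunk by chunk as data arrives

  stream.pipe(decoder)
  decoder.on('data', function (str) { body += str })
  decoder.on('end', function () { callback(null, body) })
  decoder.on('error', callback)
}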
The master branch contains the necessary changes for Node v10; it would be great to have a new version released on npm.
Greetings,
After connect started to require the raw-body package, our build started failing, as we are unable to retrieve the https repository.
Would it be possible to change the repository URL of raw-body to git://github.com/ instead of https://?
Cheers,
Roderick
I want to depend on raw-body but handle errors slightly differently.
Currently I handle errors like this: https://github.com/Raynos/body/blob/master/index.js#L5
The difference is that I have a statusCode property instead of status, and each of my errors has a type field.
One option is for me to just create new errors, or to wrap the errors raw-body passes up. Not sure how else to deal with the differences.
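The wrapping option might look something like this (a sketch; wrapError and the surrounding callback are my own illustrative names, and I'm assuming raw-body's errors expose status):

function wrapError (err) {
  var wrapped = new Error(err.message)
  wrapped.statusCode = err.status // raw-body sets `status`; I want `statusCode`
  wrapped.type = err.type || 'raw.body.error' // hypothetical default type
  return wrapped
}

getRawBody(req, function (err, body) {
  if (err) return callback(wrapError(err))
  callback(null, body)
})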
It's all well and good to wait an indefinite time for a stream to finish being read, but typically when dealing with HTTP (as this module does), you need some kind of timeout. Right now you can in fact sort of achieve this if you do a bunch of work to wrap a timer and manually emit an error event on req, for example, but this is lame.
I think there should be an optional timeout (defaulting to Infinity, as now). What I'm not sure of yet is how the timeout should specifically work: is it a total-time-taken timer or an inactivity timer? Both have pros and cons.
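The workaround described above, sketched out (TIMEOUT_MS and the surrounding handler are illustrative):

var TIMEOUT_MS = 5000

var timer = setTimeout(function () {
  // manually abort the read by emitting an error on the request stream
  req.emit('error', new Error('request body read timeout'))
}, TIMEOUT_MS)

getRawBody(req, function (err, body) {
  clearTimeout(timer)
  if (err) return next(err)
  // ... use body
})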
I would like to be able to pass our own Promise implementation as an option, rather than setting it as a global.
Reasoning:
The primary reason I didn't do this before was content-encoding, but if we handle content-encoding, we could just grab all the necessary info from stream.headers, if available.
I'm sure there will be edge cases if people do crazy shit...
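A sketch of what grabbing the info from stream.headers might look like, assuming an http.IncomingMessage-like stream (not the module's actual behavior):

var length = null
if (stream.headers && !stream.headers['content-encoding']) {
  // only trust content-length when the body is not compressed
  var len = parseInt(stream.headers['content-length'], 10)
  if (!isNaN(len)) length = len
}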
Koa uses async/await now instead of generators.
I want to test the length option, but I get a Parse Error instead of the expected 400 error code. The lib can't capture the error and just throws it out.
rawBody(request, { limit: '1mb', encoding: 'utf8', length: 191 })

// the request:
request(app.listen())
  .post('/')
  .set('Content-Type', 'text/xml')
  .set('Content-Length', 191) // here is the length
  .send(sourceXml) // the data's actual length is 197
  .expect(200, done);
and I get the error:
Error: Parse Error
  at Error (native)
  at Socket.socketOnData (_http_server.js:343:22)
  at Socket.emit (events.js:107:17)
  at readableAddChunk (_stream_readable.js:163:16)
  at Socket.Readable.push (_stream_readable.js:126:10)
  at TCP.onread (net.js:538:20)
So, to provide some context: I'm using Jest to write tests, and I get a bunch of warnings like the one below:
Jest has detected the following 3 open handles potentially keeping Jest from exiting:
● bound-anonymous-fn
79 |
80 | await request(app.getHttpServer())
> 81 | .post('/client-mapping')
| ^
82 | .set('Content-Type', 'application/json')
83 | .send(requestBody)
84 | .expect(200)
Jest has a feature to detect open handles, and it uses async_hooks for that. Essentially, it registers the asyncId when it receives the init hook and unregisters it when it gets the destroy hook. If anything is still registered after the tests have run, then some async resource wasn't properly destroyed. The full implementation is here.
I've narrowed the warning down to this library. Apparently, it creates the AsyncResource here, but it never calls .emitDestroy(), and therefore the destroy hook never fires. According to the Node documentation, the hook must be called manually.
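A minimal sketch of the fix being suggested, per the async_hooks docs (the resource name is illustrative):

const { AsyncResource } = require('async_hooks')

const resource = new AsyncResource('raw-body.getRawBody')

resource.runInAsyncScope(() => {
  // ... read the stream ...
})

// once the body has been fully read (or reading has failed),
// tell async_hooks the resource is gone so the destroy hook fires:
resource.emitDestroy()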
Hello,
I have this error; I am using Koa middleware.
TypeError: stream.on is not a function
  at readStream (/node_modules/raw-body/index.js:197:10)
  at executor (/node_modules/raw-body/index.js:113:5)
  at new Promise (<anonymous>)
  at getRawBody (/node_modules/raw-body/index.js:112:10)
  at /src/index.ts:99:30
  at step (/src/index.ts:33:23)
  at Object.next (/src/index.ts:14:53)
  at /src/index.ts:8:71
  at new Promise (<anonymous>)
  at __awaiter (/src/index.ts:4:12)
  at /src/index.ts:95:29
  at dispatch (/node_modules/koa-compose/index.js:42:32)
  at filter (/node_modules/koa-json/index.js:25:12)
  at dispatch (/node_modules/koa-compose/index.js:42:32)
  at cors (/node_modules/@koa/cors/index.js:59:38)
  at dispatch (/node_modules/koa-compose/index.js:42:32)
My code is:
var contentType = require('content-type')
var getRawBody = require('raw-body')

app.use(async (ctx, next) => {
  if (ctx.request.originalUrl.startsWith('/raw-parse')) {
    ctx.text = await getRawBody(ctx.request, {
      length: ctx.request.headers['content-length'],
      limit: '1mb',
      encoding: contentType.parse(ctx.request).parameters.charset
    })
  }
  await next();
})
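A likely explanation, though this is my assumption rather than something confirmed here: ctx.request is Koa's request wrapper, not the underlying Node stream, so it has no .on method. Passing the raw Node request instead would look like this:

ctx.text = await getRawBody(ctx.req, { // ctx.req is the underlying Node.js request stream
  length: ctx.req.headers['content-length'],
  limit: '1mb',
  encoding: contentType.parse(ctx.req).parameters.charset
})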
I'm using https://github.com/balderdashy/skipper to upload files in a Sails.js application, and it gave the error Error: Cannot switch to old mode now. After some digging, I found that the error is fixed by commenting out line #33 in stream-utils/raw-body/index.js. It seems to be related to this issue: https://github.com/balderdashy/skipper/issues/10.
While I'm convinced this is an issue related to balderdashy/skipper specifically, I'm posting it here anyway; maybe you can shed some light on it.
Cheers, Fabian
You throw an "expected a buffer stream" error if an encoding was set, but only after the stream has already been consumed.
Would it be helpful to throw this error at the start as well, i.e. check the encoding before consuming the stream? (See the sketch after the demo below.)
Demo code:
app.use(function* (next) {
  var string = yield getRawBody(this.req, {
    length: this.length,
    limit: '1mb',
    encoding: this.charset
  })
  this.body = string;
});
When I make the request using curl:
curl -d 'xxxxxxxxxxxxxxx' 'http://localhost:80/' -H 'Content-Type: application/xml'
it's OK. But when I change the Content-Type to text/xml, it cannot be parsed:
curl -d 'xxxxxxxxxxxxxxx' 'http://localhost:80/' -H 'Content-Type: text/xml'
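As promised above, a sketch of what the up-front encoding check might look like (the error shape here is illustrative, not necessarily what the module would throw):

var state = stream._readableState
if (state && state.encoding != null) {
  // fail fast before attaching listeners or consuming any data
  return done(createError(500, 'stream encoding should not be set', {
    type: 'stream.encoding.set'
  }))
}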
You dump the stream at the start if content-length > limit, but you don't dump the stream if the limit is hit in the data handler.
In both cases you return a 413.
Why do you dump the stream? Shouldn't the application-level error handler call req.close()?
getRawBody(res, function (err, string) {
  getRawBody(res, function (err, string) {
    console.log('never called')
  })
})
Or, more realistically, you're using app.use(bodyParser) together with a library like json-api, and both try to run getRawBody.
Not sure if it's raw-body's job to check that the stream hasn't already ended, or the user's job to make sure they're not passing getRawBody an expired stream.
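If raw-body were to check, a guard might look like this (it relies on the internal _readableState, so treat it as illustrative only):

var state = stream._readableState
if (state && state.endEmitted) {
  return callback(new Error('stream has already ended and cannot be read again'))
}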
So the question arises from #23 of whether or not this module should accept a "flowing stream". Node.js core defines a flowing stream as one that has been piped. Right now this will work as long as there are no errors; otherwise it will explode.
I'm leaning toward saying we should reject flowing streams.
Example of passing a flowing stream:
req.pipe(my_stream) // makes it "flowing"
getRawBody(req, 'utf8', function (err, body) {
})
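A sketch of how flowing streams could be detected and rejected (_readableState is internal, so this is illustrative rather than a proposal for the exact implementation):

function isFlowing (stream) {
  var state = stream._readableState
  return !!(state && state.flowing)
}

if (isFlowing(req)) {
  // reject instead of trying to read a stream that is already piped
  return callback(new Error('stream must not be in flowing mode'))
}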
Hi there,
I probably have something not configured correctly, but I have researched and googled this issue for a couple of days now and haven't made any headway; any help would be hugely appreciated.
When I send an x-www-form-urlencoded request with a couple of parameters, the request seems to time out, resulting in the onAborted stream listener triggering (after a 120-second timeout).
My request is a POST that I can send with Postman to reproduce.
I have added some logging inside node_modules/raw-body/index.js to try to track down what's actually happening inside the readStream function.
It appears that none of the listeners ever trigger (none of these functions are called, except for onAborted):
// attach listeners
stream.on("aborted", onAborted);
stream.on("close", cleanup);
stream.on("data", onData);
stream.on("readable", (chunk) => {
  console.log("data can be read", chunk);
});
stream.on("end", onEnd);
stream.on("error", onEnd);
After I send the request, only onAborted is triggered, and only after 120 seconds (on our dev server it's 60 seconds), so I believe this is the result of a timeout firing.
After logging the stream, I can see that the body is actually being parsed completely and correctly:
files: null,
read: [Function],
body:
{ bt_signature: 'bn3gnsr7qw534sgt%26cxqbt32bycvftqkp%7Cd3180e35f92dd5e9b772cf330f6076fa5d31d235',
bt_payload: 'PD94bWwgdmVyc2lvbj0iMS4wIiBlbmNvZGluZz0iVVRGLTgiPz4KPG5vdGlm%0AaWNhdGlvbj4KICA8a2luZD5jaGVjazwva2luZD4KICA8dGltZXN0YW1wIHR5%0AcGU9ImRhdGV0aW1lIj4yMDE2LTEyLTE0VDE1OjUxOjM1WjwvdGltZXN0YW1w%0APgogIDxzdWJqZWN0PgogICAgPGNoZWNrIHR5cGU9ImJvb2xlYW4iPnRydWU8%0AL2NoZWNrPgogIDwvc3ViamVjdD4KPC9ub3RpZmljYXRpb24%2BCg%3D%3D%0A' },
_body: true,
length: undefined }
The readableState of the stream looks like this:
ReadableState {
  objectMode: false,
  highWaterMark: 16384,
  buffer: [Object],
  length: 0,
  pipes: null,
  pipesCount: 0,
  flowing: true,
  ended: false,
  endEmitted: false,
  reading: true,
  sync: false,
  needReadable: true,
  emittedReadable: false,
  readableListening: false,
  resumeScheduled: false,
  defaultEncoding: 'utf8',
  ranOut: false,
  awaitDrain: 0,
  readingMore: false,
  decoder: null,
  encoding: null },
I've tried a few different versions of Node (v6.2.2 and v7.2.1), all with the same behavior.
I'm not quite sure what else to try, or what other information I should provide. Do you have any thoughts on why this is happening?
Thank you,
Doug