Comments (7)
Could you include an example of code that does this? I'm curious how you're writing a response before reading from the body
from async-h1.
I don't have an example to hand, but I can produce one if that would help. I believe it should be fairly easy to reproduce:
- In the request handler, take the `Body` from the request and put it on the response.
- That's it.

The server code will begin by encoding the response headers into the stream, and will then begin reading from the body. This will in turn trigger a read from the request body (since they are one and the same), which will result in the `100 Continue` message being written into the middle of the response.
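A minimal, std-only sketch of that interleaving (the types and names here are illustrative stand-ins, not the real async-h1 API): the server encodes the response head, then polls the response body, which is the request body, so the lazily-sent interim response lands mid-stream.

```rust
// Hypothetical model of the bug; `Connection` and its fields are
// illustrative, not async-h1's actual types.
struct Connection {
    wire: Vec<u8>,          // bytes queued to send to the client
    continue_pending: bool, // request carried `Expect: 100-continue`
}

impl Connection {
    // Reading the request body lazily triggers the interim response.
    fn read_request_body(&mut self) -> Vec<u8> {
        if self.continue_pending {
            self.wire.extend_from_slice(b"HTTP/1.1 100 Continue\r\n\r\n");
            self.continue_pending = false;
        }
        b"hello".to_vec() // stand-in for the client's body bytes
    }
}

fn respond_with_echoed_body() -> String {
    let mut conn = Connection { wire: Vec::new(), continue_pending: true };
    // 1. The server encodes the response head into the stream...
    conn.wire.extend_from_slice(b"HTTP/1.1 200 OK\r\n\r\n");
    // 2. ...then reads the response body, which *is* the request body,
    //    so `100 Continue` is written into the middle of the response.
    let body = conn.read_request_body();
    conn.wire.extend_from_slice(&body);
    String::from_utf8(conn.wire).unwrap()
}

fn main() {
    // The 100 Continue appears *after* the 200 OK head: a corrupt response.
    println!("{}", respond_with_echoed_body());
}
```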
from async-h1.
> Send "100 Continue" before either a read from the request body, or when starting to write the response. Only omit it if the server explicitly drops the request body before beginning to send a response.
It seems the ordering might have been messed up here a bit. The design was intended so that the moment we start to read the request body, we write out `100 Continue` to the response.

But you're right to point out that if we start writing to the response stream and then read the body, `100 Continue` will be sent midway through writing the response. The fix seems to be to also write `100 Continue` the moment the response starts writing, so that either reading from the request or writing to the response triggers `100 Continue`.
from async-h1.
> The fix seems to be to also write `100 Continue` the moment the response starts writing, so that either reading from the request or writing to the response triggers `100 Continue`.

Won't this result in the `100 Continue` always being sent?
FWIW, I have an experimental branch which solves several of these related issues (and fixes pipelining), I'll open a draft PR with that soon (there's one test failing right now).
from async-h1.
> Won't this result in the `100 Continue` always being sent?
Oh, I was thinking this would be done through two atomic bools (or equivalent):

- Detect whether a `100 Continue` needs to be sent when parsing the request headers and set a bool. This prevents it from being sent if not needed.
- If a continue needs to be sent, then either on first read from the request or on first write to the response, flip the other bool and send the `100 Continue`. This prevents it from being sent twice.
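The two-bool scheme above can be sketched with std atomics alone (the names here are illustrative, not async-h1's internals): one flag is set while parsing headers, the other is swapped by whichever of the first body read or first response write happens first, so the interim response goes out at most once.

```rust
use std::sync::atomic::{AtomicBool, Ordering};

// Illustrative sketch of the two-atomic-bool scheme.
struct ExpectContinue {
    needed: AtomicBool, // set during header parsing if `Expect: 100-continue` was present
    sent: AtomicBool,   // flipped by whichever of read/write happens first
}

impl ExpectContinue {
    fn new(expect_header_present: bool) -> Self {
        Self {
            needed: AtomicBool::new(expect_header_present),
            sent: AtomicBool::new(false),
        }
    }

    /// Called on both the first request-body read and the first response
    /// write. Returns `true` at most once, and never when no continue is
    /// needed, so the caller emits `100 Continue` exactly once or not at all.
    fn take_send_permit(&self) -> bool {
        self.needed.load(Ordering::Acquire) && !self.sent.swap(true, Ordering::AcqRel)
    }
}

fn main() {
    let state = ExpectContinue::new(true);
    assert!(state.take_send_permit());  // first trigger wins the permit
    assert!(!state.take_send_permit()); // second trigger is a no-op

    let no_expect = ExpectContinue::new(false);
    assert!(!no_expect.take_send_permit()); // never sent when not requested
}
```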
> FWIW, I have an experimental branch which solves several of these related issues (and fixes pipelining), I'll open a draft PR with that soon (there's one test failing right now).

Ohhh, neat!
from async-h1.
> Oh, I was thinking this would be done through two atomic bools (or equivalent).
Sorry, I didn't mean always sent. Rather, even if the `Expect` header is sent, there is an optimization: if the body is never read, we don't send the continue message at all. If we send the continue message on the first write, then that optimization won't ever kick in.

In my branch, I prevent this by having further writes to the stream block until a decision has been made about whether to send the `100 Continue` (if you drop the `Body` without reading it, it will not be sent; if you start reading from it, it will be sent). However, this is only an improvement: misuse (attempting to write before doing either of those things) will result in a deadlock rather than a corrupted response.
One possible solution other than those listed above would be to transparently buffer the writes in memory until the request body is either read or dropped. I'm leaning towards this one right now.
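A rough sketch of that buffering approach (again with made-up names, not async-h1's actual implementation): response writes accumulate in memory while the fate of the request body is undecided, and flushing happens only once the body has been read (prefix with `100 Continue`) or dropped (skip it).

```rust
// Hypothetical sketch of buffering response writes until the fate of the
// request body is known.
enum BodyFate {
    Undecided, // handler has neither read nor dropped the body yet
    Read,      // body was read: a `100 Continue` must go out first
    Dropped,   // body was dropped unread: skip the interim response
}

struct BufferedResponse {
    pending: Vec<u8>, // response bytes held back while undecided
    wire: Vec<u8>,    // bytes actually emitted to the client
    fate: BodyFate,
}

impl BufferedResponse {
    fn new() -> Self {
        Self { pending: Vec::new(), wire: Vec::new(), fate: BodyFate::Undecided }
    }

    // Writes buffer in memory until a decision is made, then pass through.
    fn write(&mut self, bytes: &[u8]) {
        match self.fate {
            BodyFate::Undecided => self.pending.extend_from_slice(bytes),
            _ => self.wire.extend_from_slice(bytes),
        }
    }

    // Called when the handler first reads, or drops, the request body.
    fn decide(&mut self, body_was_read: bool) {
        if body_was_read {
            self.wire.extend_from_slice(b"HTTP/1.1 100 Continue\r\n\r\n");
            self.fate = BodyFate::Read;
        } else {
            self.fate = BodyFate::Dropped;
        }
        self.wire.append(&mut self.pending); // flush buffered writes in order
    }
}

fn main() {
    let mut resp = BufferedResponse::new();
    resp.write(b"HTTP/1.1 200 OK\r\n\r\n"); // buffered: fate still unknown
    resp.decide(true);                      // body read: continue goes out first
    resp.write(b"body");                    // later writes pass straight through
    assert!(resp.wire.starts_with(b"HTTP/1.1 100 Continue\r\n\r\n"));
}
```

Unlike the blocking approach, a misbehaving handler here costs memory rather than deadlocking, which may be why it looks attractive.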
from async-h1.
> there's one test failing right now

This seems to be caused by the `ChunkedDecoder` sometimes reading past the end of the request. It should probably have a `BufRead` bound instead of `Read` so that it can avoid this.
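To illustrate why the bound matters (this is a std-only sketch, not the real `ChunkedDecoder`): with `BufRead`, the decoder can consume exactly through each delimiter, e.g. `read_line` for the hex size line, so bytes belonging to the next request stay in the underlying buffer, whereas a plain `Read` has no way to hand back bytes it over-reads.

```rust
use std::io::{BufRead, Cursor, Read};

// Illustrative single-chunk decoder over a `BufRead` source. It consumes
// only the size line, the chunk data, and the trailing CRLF, leaving any
// following bytes (e.g. the next request) untouched.
fn read_chunk(reader: &mut impl BufRead) -> std::io::Result<Vec<u8>> {
    // `read_line` needs buffering: it stops exactly after the '\n'.
    let mut size_line = String::new();
    reader.read_line(&mut size_line)?;
    let size = usize::from_str_radix(size_line.trim_end(), 16)
        .map_err(|e| std::io::Error::new(std::io::ErrorKind::InvalidData, e))?;

    // Read exactly `size` bytes of chunk data, then the chunk's CRLF.
    let mut data = vec![0; size];
    reader.read_exact(&mut data)?;
    let mut crlf = [0; 2];
    reader.read_exact(&mut crlf)?;
    Ok(data)
}

fn main() -> std::io::Result<()> {
    // "5\r\nhello\r\n" is one chunk; everything after it must survive.
    let mut cur = Cursor::new(b"5\r\nhello\r\n0\r\n\r\nNEXT".to_vec());
    let chunk = read_chunk(&mut cur)?;
    assert_eq!(chunk, b"hello");

    let mut rest = String::new();
    cur.read_to_string(&mut rest)?;
    assert_eq!(rest, "0\r\n\r\nNEXT"); // terminator and next request intact
    Ok(())
}
```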
from async-h1.