Comments (18)
In order to compress responses, we have to know that the client (browser) can actually decompress them. If the browser wants compressed responses, it will include the request header Accept-Encoding: gzip to indicate it is willing to accept gzip'd responses. Without that header, this module will not compress the response, and you'll get the debug message no compression: not acceptable.
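The check amounts to roughly this (a simplified sketch; the real module negotiates via the `accepts` package and also weighs q-values, which this deliberately ignores):

```javascript
// Simplified sketch of the Accept-Encoding check: compress only when the
// client advertises gzip. The real module's negotiation is more thorough.
function shouldGzip(headers) {
  const accept = headers['accept-encoding'] || '';
  return accept
    .split(',')
    .map(part => part.trim().split(';')[0]) // drop any ;q= parameters
    .includes('gzip');
}

console.log(shouldGzip({ 'accept-encoding': 'gzip, deflate' })); // true
console.log(shouldGzip({})); // false → "no compression: not acceptable"
```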
from compression.
I just Googled "Firefox Accept-Encoding" and saw that if your Firefox is not including that header, there may be something wrong with your Firefox profile (I don't have any issues with Firefox). You may need to reset your Firefox profile, or research what it takes to fix it, but that's beyond the scope here, unfortunately.
The request does contain it, and it is being ignored. Here are the headers:
Firefox Request:
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:37.0) Gecko/20100101 Firefox/37.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: hu,en-GB;q=0.7,en;q=0.3
Accept-Encoding: gzip, deflate
DNT: 1
Connection: keep-alive
Pragma: no-cache
Cache-Control: no-cache
Firefox Response:
Accept-Ranges: bytes
Cache-Control: public, max-age=0
Content-Disposition: attachment; filename="test_results.json"
Content-Length: 16848
Content-Type: application/json
Date: Sun, 19 Apr 2015 15:29:52 GMT
Etag: W/"41d0-4181526835"
Last-Modified: Sun, 19 Apr 2015 08:57:36 GMT
Strict-Transport-Security: max-age=16070400; includeSubdomains
Vary: Accept-Encoding
X-Firefox-Spdy: 3.1
X-Frame-Options: DENY
X-XSS-Protection: 1; mode=block
x-content-type-options: nosniff
x-download-options: noopen
Chrome Request:
accept:text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8
accept-encoding:gzip, deflate, sdch
accept-language:hu-HU,hu;q=0.8,en-US;q=0.6,en;q=0.4
cache-control:no-cache
dnt:1
pragma:no-cache
user-agent:Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/42.0.2311.90 Safari/537.36
Chrome Response:
accept-ranges:bytes
cache-control:public, max-age=0
content-disposition:attachment; filename="test_results.json"
content-encoding:gzip
content-type:application/json
date:Sun, 19 Apr 2015 15:32:11 GMT
etag:W/"41d0-4181526835"
last-modified:Sun, 19 Apr 2015 08:57:36 GMT
status:200 OK
strict-transport-security:max-age=16070400; includeSubdomains
vary:Accept-Encoding
version:HTTP/1.1
x-content-type-options:nosniff
x-download-options:noopen
x-frame-options:DENY
x-xss-protection:1; mode=block
As I have already said, the only difference is the presence of Connection: keep-alive in the Firefox request, but that should not cause problems.
There is nothing wrong with my Firefox profile. The behavior is the same with a fresh install of Firefox Developer Edition.
Gotcha. I have Firefox and it works just fine. Without being able to reproduce it locally, there isn't much I can say. Can you please dig in and propose a fix?
I'm using Firefox 37.0.1, by the way, connecting directly to a Node.js server on localhost, so as to avoid any alteration of requests/responses by intermediate proxies.
I am on Windows 7 x64, by the way. In theory, the headers should be the only thing that matters, no?
Maybe I should include the compression middleware later? Currently it's the first one. If it weren't working in Chrome either, I would think it's my fault, but as it is, I have no idea what the problem could be.
Also, my Firefox can handle compression; it works with the nginx server, just not with the Node one.
Also, it would help me if you could post a console.dir(req.headers) for those two requests, so I can see what Node.js sees as the request headers. You can probably add it anywhere, but putting it here would be best: https://github.com/expressjs/compression/blob/1.4.3/index.js#L163
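For anyone following along, the ask is just a tiny middleware mounted before compression() (a sketch; logHeaders is a name I made up, not part of the module):

```javascript
// Hypothetical debug middleware: dump exactly what Node.js received.
// Mount it before compression() so nothing has touched req.headers yet.
function logHeaders(req, res, next) {
  console.dir(req.headers);
  next();
}

// Usage (assumes an Express app):
//   app.use(logHeaders);
//   app.use(compression());
```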
I am on Windows 7 x64 btw.
I'm also using Windows 7 x64.
In theory, the headers should be the only thing that matters, no?
Yes, the request headers are all that matter, at least for that debug message you posted (we also consider the response type and size). The above console.dir will help a lot here.
Currently it's the first one.
This is fine. Basically it'll compress all responses going through the middleware, so having it first just means everything will be considered for compression.
Firefox:
{ host: 'localhost',
'user-agent': 'Mozilla/5.0 (Windows NT 6.1; WOW64; rv:37.0) Gecko/20100101 Firefox/37.0',
accept: 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
'accept-language': 'hu,en-GB;q=0.7,en;q=0.3',
dnt: '1',
'cache-control': 'no-cache',
pragma: 'no-cache' }
Chrome:
{ host: 'localhost',
accept: 'text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8',
'accept-encoding': 'gzip, deflate, sdch',
'accept-language': 'hu-HU,hu;q=0.8,en-US;q=0.6,en;q=0.4',
'cache-control': 'no-cache',
dnt: '1',
pragma: 'no-cache',
  'user-agent': 'Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/42.0.2311.90 Safari/537.36' }
OK, so clearly accept-encoding disappears somewhere along the way, even though Firefox has sent it.
OK, so clearly accept-encoding disappears somewhere along the way, even though Firefox has sent it.
Right, and so this module would never compress it. There are three main reasons why this would happen:
- An intermediate proxy is removing it (this is the most common cause and is reported periodically). To rule this out, run your app on localhost and connect to it directly.
- Something in your Node.js app is altering req.headers. You can determine this by adding a console.log(req.headers) here: https://github.com/joyent/node/blob/v0.10.26/lib/http.js#L2097
- The browser is lying to you. You can use something like Fiddler to see the headers the browser is sending from an independent vantage point.
I am not using any proxy; I am connecting directly on localhost.
Even if my first middleware does nothing but print headers, the Accept-Encoding header is already gone. Could the spdy module be the culprit? Besides Express, that's the only thing that gets access to the headers before Accept-Encoding disappears.
I also noticed that Chrome says accept-encoding while Firefox says Accept-Encoding, but that should not be a problem either.
I can't imagine Firefox lying about this.
Could spdy module be the culprit?
Maybe, I'm not sure. But there is nothing I can change in this module, since the header is not there for us to read. Please keep digging and let me know what you find. If you can provide an app.js file that I can copy, paste, and run on my machine to reproduce it, I can dig around as well.
Also noticed, that Chrome says accept-encoding while Firefox says Accept-Encoding, but that should not be a problem either.
Yes, that's not an issue. Headers are not case-sensitive, and Node.js actually normalizes them to all lower-case, which is why you always see them in lower-case in Node.js.
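That normalization is why lookups in req.headers should always use lower-case keys, whatever case the browser used on the wire (a small sketch, not code from the module):

```javascript
// Node.js stores incoming header names lower-cased, so the wire header
// 'Accept-Encoding: gzip' arrives as { 'accept-encoding': 'gzip' }.
// A helper that tolerates whichever case the caller passes:
function getHeader(headers, name) {
  return headers[name.toLowerCase()];
}

console.log(getHeader({ 'accept-encoding': 'gzip' }, 'Accept-Encoding')); // 'gzip'
```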
I can't imagine Firefox lying about this.
You'd be surprised; I suggested that because it has literally happened before.
Bingo, disabling spdy fixes the disappearing header problem. Firefox could not lie about this, since it does work with nginx. Now, how do I debug that? :)
Firefox could not lie about this, since it does work with nginx.
Ah, I see. This is really starting to sound like a bug in the spdy
module to me (it's dropping a header?). I don't really know anything about that module. Perhaps open an issue over there and link back to this issue for context?
Ok, Issue raised at spdy: spdy-http2/node-spdy#199
(Firefox should probably be included in the issue title, since it all works fine with Chrome.)
Cool. FYI, from Apache's issue tracker, the answer is that Firefox is actually lying to you: https://issues.apache.org/jira/browse/TS-3026 Firefox does not send the Accept-Encoding header over SPDY connections, apparently. Apache's fix was to just pretend the header was there on SPDY connections; I assume nginx does the same thing. This suggests that node-spdy should do the same.
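Until node-spdy fixes it, the Apache/nginx workaround could be approximated in userland (a sketch under assumptions: node-spdy marks requests with req.isSpdy, and spdyAcceptEncodingShim is a name I made up):

```javascript
// Re-add the header Firefox omits on SPDY connections, mirroring what
// Apache and nginx do. Mount this before compression().
// Assumption: node-spdy sets req.isSpdy on SPDY requests.
function spdyAcceptEncodingShim(req, res, next) {
  if (req.isSpdy && !req.headers['accept-encoding']) {
    req.headers['accept-encoding'] = 'gzip, deflate';
  }
  next();
}

// Usage (assumes an Express app):
//   app.use(spdyAcceptEncodingShim);
//   app.use(compression());
```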
And the relevant nginx issue: http://trac.nginx.org/nginx/ticket/542
Holy cow, I did not think such a ridiculous situation was possible, especially since the issue is so old.
Here is the Firefox bug: https://support.mozilla.org/en-US/questions/995095
Thanks for clearing it all up. :)
No problem :) I don't think Firefox's developer console wants to lie, but the last time it did, I read that the reason is a fundamental disconnect between the headers Firefox wants to send and what is actually sent over the wire. The console displays the former rather than the latter, so the headers you see in the console aren't what went over the wire, only what was handed down to a lower Firefox subsystem that may alter them.
IMO the actions Apache/nginx took make sense: just add the header back on any connection over SPDY. I think it may also be desirable for this module to add an option to always compress with a specific encoding, ignoring the Accept-Encoding header, for cases where you don't care what the client says :)
I'm also glad we got to the bottom of this :) I was not aware of this Firefox SPDY Accept-Encoding issue previously and will be "noodling" over it.