
fastify-compress's Introduction

@fastify/compress


Adds compression utils to the Fastify reply object and a hook to decompress request payloads. Supports gzip, deflate, and brotli.

Important note: since @fastify/compress version 4.x, payloads compressed with the zip algorithm are no longer automatically uncompressed. The main feature of @fastify/compress is to provide a response compression mechanism for your server; the zip format does not appear in the IANA-maintained Table of Content Encodings, so that behavior was out of the scope of this plugin.

Install

npm i @fastify/compress

Usage - Compress replies

This plugin adds two functionalities to Fastify: a compress utility and a global compression hook.

Currently, the following encoding tokens are supported, using the first acceptable token in this order:

  1. br
  2. gzip
  3. deflate
  4. * (no preference — @fastify/compress will use gzip)
  5. identity (no compression)

If an unsupported encoding is received, or if the accept-encoding header is missing, the payload will not be compressed. If an unsupported encoding is received and you would like to return an error instead, provide an onUnsupportedEncoding option.
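The negotiation described above can be sketched in plain JavaScript. This is an illustrative sketch, not the plugin's actual source:

```javascript
// Among the tokens the client accepts, pick the first one in the
// plugin's priority order.
const priority = ['br', 'gzip', 'deflate', 'identity']

function negotiateEncoding (acceptEncoding) {
  if (!acceptEncoding) return null // missing header: payload is not compressed
  const accepted = acceptEncoding
    .toLowerCase()
    .replace(/\*/g, 'gzip') // treat the no-preference token as gzip
    .split(',')
    .map(token => token.split(';')[0].trim())
  return priority.find(encoding => accepted.includes(encoding)) || null
}

console.log(negotiateEncoding('gzip, deflate, br')) // 'br'
console.log(negotiateEncoding('*'))                 // 'gzip'
console.log(negotiateEncoding('zstd'))              // null
```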

The plugin automatically decides if a payload should be compressed based on its content-type; if no content type is present, it will assume application/json.

Global hook

The global compression hook is enabled by default. To disable it, pass the option { global: false }:

await fastify.register(
  import('@fastify/compress'),
  { global: false }
)

Remember that, thanks to the Fastify encapsulation model, you can enable compression globally but run it only on a subset of routes by wrapping those routes inside a plugin.

Important note! If you are using the @fastify/compress plugin together with the @fastify/static plugin, you must register @fastify/compress (with its global hook) before registering @fastify/static.

Per Route options

You can specify different options for compression per route by passing in the compress options on the route's configuration.

await fastify.register(
  import('@fastify/compress'),
  { global: false }
)

// only compress if the payload is above a certain size and use brotli
fastify.get('/custom-route', {
  compress: {
    inflateIfDeflated: true,
    threshold: 128,
    zlib: {
      createBrotliCompress: () => createYourCustomBrotliCompress(),
      createGzip: () => createYourCustomGzip(),
      createDeflate: () => createYourCustomDeflate()
    }
  }
}, (req, reply) => {
  // ...
})

Note: Setting compress = false on any route will disable compression on the route even if global compression is enabled.

reply.compress

This plugin adds a compress method to reply that accepts a stream or a string and compresses it based on the accept-encoding header. If a JS object is passed in, it will be stringified to JSON. Note that the compress method is configured with the per-route parameters if the route has a custom configuration, or with the global parameters if the route has no custom parameters but the plugin was registered globally.

import fs from 'fs'
import fastify from 'fastify'

const app = fastify()
await app.register(import('@fastify/compress'), { global: false })

app.get('/', (req, reply) => {
  reply
    .type('text/plain')
    .compress(fs.createReadStream('./package.json'))
})

await app.listen({ port: 3000 })

Compress Options

threshold

The minimum byte size for a response to be compressed. Defaults to 1024.

await fastify.register(
  import('@fastify/compress'),
  { threshold: 2048 }
)

customTypes

mime-db is used to determine if a content-type should be compressed. You can compress additional content types via regular expression or by providing a function.

await fastify.register(
  import('@fastify/compress'),
  { customTypes: /x-protobuf$/ }
)

or

await fastify.register(
  import('@fastify/compress'),
  { customTypes: contentType => contentType.endsWith('x-protobuf') }
)

onUnsupportedEncoding

When the encoding is not supported, a custom error response can be sent in place of the uncompressed payload by setting the onUnsupportedEncoding(encoding, request, reply) option to a function that can modify the reply and return a string | Buffer | Stream | Error payload.

await fastify.register(
  import('@fastify/compress'),
  {
    onUnsupportedEncoding: (encoding, request, reply) => {
      reply.code(406)
      return 'We do not support the ' + encoding + ' encoding.'
    }
  }
)

Disable compression by header

You can selectively disable response compression by using the x-no-compression header in the request.

Inflate pre-compressed bodies for clients that do not support compression

Optional feature to inflate pre-compressed data if the client does not include one of the supported compression types in its accept-encoding header.

await fastify.register(
  import('@fastify/compress'),
  { inflateIfDeflated: true }
)

fastify.get('/file', (req, reply) =>
  // will inflate the file on the way out for clients
  // that indicate they do not support compression
  reply.send(fs.createReadStream('./file.gz')))

Customize encoding priority

By default, @fastify/compress prioritizes compression as described at the beginning of §Usage - Compress replies. You can change that by passing an array of compression tokens to the encodings option:

await fastify.register(
  import('@fastify/compress'),
  // Only support gzip and deflate, and prefer deflate to gzip
  { encodings: ['deflate', 'gzip'] }
)

brotliOptions and zlibOptions

You can tune compression by setting the brotliOptions and zlibOptions properties. These properties are passed directly to the native Node.js zlib methods, so they should match the corresponding class definitions.

  server.register(fastifyCompress, {
    brotliOptions: {
      params: {
        [zlib.constants.BROTLI_PARAM_MODE]: zlib.constants.BROTLI_MODE_TEXT, // useful for APIs that primarily return text
        [zlib.constants.BROTLI_PARAM_QUALITY]: 4, // default is 4, max is 11, min is 0
      },
    },
    zlibOptions: {
      level: 6, // default is typically 6, max is 9, min is 0
    }
  });

Manage Content-Length header removal with removeContentLengthHeader

By default, @fastify/compress removes the reply Content-Length header. You can change that by setting removeContentLengthHeader to false, either at the global scope or at the route level.

  // Global plugin scope
  await server.register(fastifyCompress, { global: true, removeContentLengthHeader: false });

  // Route specific scope
  fastify.get('/file', {
    compress: { removeContentLengthHeader: false }
  }, (req, reply) =>
    reply.compress(fs.createReadStream('./file.gz'))
  )

Usage - Decompress request payloads

This plugin adds a preParsing hook that decompresses the request payload according to the content-encoding request header.

Currently, the following encoding tokens are supported:

  1. br
  2. gzip
  3. deflate

If an unsupported encoding or an invalid payload is received, the plugin will throw an error.

If the content-encoding request header is missing, the plugin does nothing and yields to the next hook.

Global hook

The global request decompression hook is enabled by default. To disable it, pass the option { global: false }:

await fastify.register(
  import('@fastify/compress'),
  { global: false }
)

Remember that, thanks to the Fastify encapsulation model, you can enable decompression globally but run it only on a subset of routes by wrapping those routes inside a plugin.

Per Route options

You can specify different options for decompression per route by passing in the decompress options on the route's configuration.

await fastify.register(
  import('@fastify/compress'),
  { global: false }
)

// Always decompress using gzip
fastify.get('/custom-route', {
  decompress: {
    forceRequestEncoding: 'gzip',
    zlib: {
      createBrotliDecompress: () => createYourCustomBrotliDecompress(),
      createGunzip: () => createYourCustomGunzip(),
      createInflate: () => createYourCustomInflate()
    }
  }
}, (req, reply) => {
    // ...
  })

requestEncodings

By default, @fastify/compress accepts all encodings specified at the beginning of §Usage - Decompress request payloads. You can change that by passing an array of compression tokens to the requestEncodings option:

await fastify.register(
  import('@fastify/compress'),
  // Only support gzip
  { requestEncodings: ['gzip'] }
)

forceRequestEncoding

By default, @fastify/compress chooses the decompression algorithm by looking at the content-encoding header, if present.

You can force one algorithm and ignore the header entirely by providing the forceRequestEncoding option.

Note that if the request payload is not compressed, @fastify/compress will try to decompress, resulting in an error.

onUnsupportedRequestEncoding

When the request payload encoding is not supported, you can customize the response error by setting the onUnsupportedRequestEncoding(request, encoding) option to a function that returns an error.

await fastify.register(
  import('@fastify/compress'),
  {
     onUnsupportedRequestEncoding: (request, encoding) => {
      return {
        statusCode: 415,
        code: 'UNSUPPORTED',
        error: 'Unsupported Media Type',
        message: 'We do not support the ' + encoding + ' encoding.'
      }
    }
  }
)

onInvalidRequestPayload

When the request payload cannot be decompressed using the detected algorithm, you can customize the response error by setting the onInvalidRequestPayload(request, encoding, error) option to a function that returns an error.

await fastify.register(
  import('@fastify/compress'),
  {
    onInvalidRequestPayload: (request, encoding, error) => {
      return {
        statusCode: 400,
        code: 'BAD_REQUEST',
        error: 'Bad Request',
        message: 'This is not a valid ' + encoding + ' encoded payload: ' + error.message
      }
    }
  }
)

Note

Please note that in large-scale scenarios, you should use a proxy like Nginx to handle response compression.

Acknowledgements

Past sponsors:

License

Licensed under MIT.

fastify-compress's People

Contributors

amokmen, cemremengu, clarkdo, climba03003, darkgl0w, delvedor, dependabot-preview[bot], dependabot[bot], dwickern, eomm, fdawgs, github-actions[bot], greenkeeper[bot], gurgunday, j-windsor, jimmyhurrah, jsumners, lependu, mcollina, mj-hd, serayaeryn, stanleyxu2005, swap76, systemdisc, thomheymann, tigt, tobinbradley, trxcllnt, uzlopak, zekth


fastify-compress's Issues

TypeScript Integration

Hello
It's a great plugin. Thanks for your work.

I think we should add TypeScript support.
This is not a hard task. But it can improve our developer experience.

I created PR which adds types definition (#55).

Thanks!

JSON is being garbled*

🐛 Bug Report

JSON data being sent from a route is being garbled*. It doesn't seem to be determined by file size or specific fields, so far as I can tell. The problem exists whether the data is returned or sent via reply.send(data) with reply.type('application/json') being set.

*garbled is definitely a technical term.

To Reproduce

Steps to reproduce the behavior:

import compress from 'fastify-compress';

export default fp(async (fastify, config?) => {
  fastify.register(blipp);
  fastify.register(cors);
  fastify.register(jwt);
  fastify.register(oauth2);
  fastify.register(compress);
  fastify.register(routes);

});
  fastify.get('/', async function(req, reply) {
    reply.type('application/json');
    reply.send([
      {
        schools: [
          '5e5978ca347b5e7e5ddf61c1',
          '5e5978ca347b5e7e5ddf61c2',
          '5e5978ca347b5e7e5ddf61c4',
          '5e5978ca347b5e7e5ddf61c5',
          '5e5978ca347b5e7e5ddf61c6',
          '5e5978ca347b5e7e5ddf61c9',
          '5e5978ca347b5e7e5ddf61ca',
          '5e5978ca347b5e7e5ddf61cb',
          '5e5978ca347b5e7e5ddf61cc',
          '5e5978ca347b5e7e5ddf61cf',
          '5e5978ca347b5e7e5ddf61d0',
          '5e5978ca347b5e7e5ddf61d1',
          '5e5978ca347b5e7e5ddf61d2',
          '5e5978ca347b5e7e5ddf61d3'
        ],
        lotSizeUnits: 'Square Feet',
        media: [
          { url: '/tmp/5380090/1.jpg', order: 1, type: 'img' },

          { url: '/tmp/5380090/9.jpg', order: 30, type: 'img' }
        ],
        propertyType: 'Residential',
        publicRemarks:
          'Customized one-story home in a desirable cul-de-sac location on an oversized 0.387 acre lot.  This open floor plan home offers 5 bedrooms/3 baths, espresso wood-look tile floors, a separate office, dining and breakfast rooms, upgraded lighting, 3 car tandem garage, a gorgeous custom steel front door, a massive master walk-in closet, and custom built-ins in living and mud room.',
        standardStatus: 'Active',
        statusChangeTimestamp: '2018-05-09T21:12:15.500Z'
      }
    ]);
  });

I've stripped down the least amount of fields to reproduce the error. Removing fields results in expected results (JSON being returned to the client). I'm not sure if the fields themselves are having an impact but so far as I can tell, they are not. It works fine on a much larger data set that needs to get sent down as one req (512kb).

Initially I was just returning the data:

fastify.get('/', async function(req, reply){ 
  const results = fastify.models.Listings.find({}).exec()
  return await results
})

but ran into this issue after installing fastify-compress so I tried setting the content-type explicitly as above

Expected behavior

JSON to be returned to the client.

Paste the results here:

<������d"�����DA-�5��rSe�^g�a�ݪ�π�ʘ���շD�����=���-����g���嚊u�����ú��B1�TE�(�
kB���u���V�	�����
$Ђ��Iz�j0�����n��MT
���L�[�y]�Q���}�K��JUJ∏y�%��9�G�����Auq%�嫊?������̆�������F�^><�g���A�y���s�*2�T�	|غ�������("���v(je��QzJ�e�������h�2D�eT���;�ǹW�l����T�������Rf�@���x��p��f˓����w��
P�zx��Z�v�?��>&/Rp��Nч	�Z��K�f^�0��zY�����En��>9��A1�wQ����U�G`!�^_o�Ep�ԟ��y�@�����
����]�ᤘ~��Dtĕ�"D֒�������

Your Environment

  • node version: v13.8.0
  • fastify version: >=2.12.1
  • os: Mac
  • ts-node: v8.6.2
  • latest stable fastify-compress

Unexpected/unwanted behavior with zip payloads

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the bug has not already been reported

Fastify version

3.24.1

Plugin version

3.7.0

Node.js version

16.13.1

Operating system

Linux

Operating system version (i.e. 20.04, 11.3, 10)

ArchLinux with kernel 5.15.5-arch1-1

Description

Since version 0.8.0, this plugin automatically unzips payloads that have been compressed using the zip algorithm.

Looking at the IANA-maintained Table of Content Encodings, payloads already compressed with the 'zip' format should not be tampered with; doing so is outside the scope of this plugin.

Steps to Reproduce

'use strict'

const { test } = require('tap')
const Fastify = require('fastify')
const compressPlugin = require('fastify-compress')
const AdmZip = require('adm-zip')

test('it should not uncompress payloads using the zip algorithm', async (t) => {
  t.plan(4)

  const fastify = Fastify()
  await fastify.register(compressPlugin, { threshold: 0, inflateIfDeflated: true })

  const json = { hello: 'world' }
  const zip = new AdmZip()
  zip.addFile('file.zip', Buffer.from(JSON.stringify(json), 'utf-8'))
  const fileBuffer = zip.toBuffer()

  fastify.get('/', (request, reply) => {
    reply.compress(fileBuffer)
  })

  const response = await fastify.inject({
    url: '/',
    method: 'GET',
    headers: {
      'x-no-compression': true
    }
  })
  t.equal(response.statusCode, 200)
  t.notOk(response.headers[ 'content-encoding' ])
  t.same(response.rawPayload, fileBuffer)
  t.equal(response.payload, fileBuffer.toString('utf-8'))
})

Current behavior :

TAP version 13
# Subtest: it should not uncompress payloads using the zip algorithm
    1..4
    ok 1 - should be equal
    ok 2 - expect falsey value
    not ok 3 - should be equivalent
      ---
      diff: >
        --- expected
      
        +++ actual
      
        @@ -1,7 +1,1 @@
      
        -Buffer <
      
        -  0000: 504b 0304 1400 0008 0800 3b67 8353 d141 09d8 1300 0000 1100 0000 0800 0000 6669  PK........;g.S.A..............fi
      
        -  0020: 6c65 2e7a 6970 ab56 ca48 cdc9 c957 b252 2acf 2fca 4951 aa05 0050 4b01 0214 0314  le.zip.V.H...W.R*./.IQ...PK.....
      
        -  0040: 0000 0808 003b 6783 53d1 4109 d813 0000 0011 0000 0008 0000 0000 0000 0000 0000  .....;g.S.A.....................
      
        -  0060: 00a4 8100 0000 0066 696c 652e 7a69 7050 4b05 0600 0000 0001 0001 0036 0000 0039  .......file.zipPK..........6...9
      
        -  0080: 0000 0000 00                                                                     .....
      
        ->
      
        +Buffer <7b22 6865 6c6c 6f22 3a22 776f 726c 6422 7d  {"hello":"world"}>
      at:
        line: 37
        column: 5
        file: test-zip.js
        type: Test
      stack: |
        Test.<anonymous> (test-zip.js:37:5)
      source: |2
          t.notOk(response.headers[ 'content-encoding' ])
          t.same(response.rawPayload, fileBuffer)
        ----^
          t.equal(response.payload, fileBuffer.toString('utf-8'))
        })
      ...
    
    not ok 4 - should be equal
      ---
      compare: ===
      at:
        line: 38
        column: 5
        file: test-zip.js
        type: Test
      stack: |
        Test.<anonymous> (test-zip.js:38:5)
      source: |2
          t.same(response.rawPayload, fileBuffer)
          t.equal(response.payload, fileBuffer.toString('utf-8'))
        ----^
        })
      diff: "--- expected
      
        +++ actual
      
        @@ -1,1 +1,1 @@
      
        -PK\x03\x04\x14\0\0\b\b\0;g�S�A\t�\x13\0\0\0\x11\0\0\0\b\0\0\0file.zip�V�H�\
        ��W�R*�/�IQ�\x05\0PK\x01\x02\x14\x03\x14\0\0\b\b\0;g�S�A\t�\x13\0\0\0\x11\0\0\
        \0\b\0\0\0\0\0\0\0\0\0\0\0��\0\0\0\0file.zipPK\x05\x06\0\0\0\0\x01\0\x01\06\0\
        \0\09\0\0\0\0\0
      
        +{\"hello\":\"world\"}\n"
      ...
    
    # failed 2 of 4 tests
not ok 1 - it should not uncompress payloads using the zip algorithm # time=198.813ms

1..1
# failed 1 test
# time=204.69ms

Expected Behavior

It should not uncompress zip payloads and should return them in their original format.
The reproduction test above should output something like this:

TAP version 13
# Subtest: it should not uncompress payloads using the zip algorithm
    1..4
    ok 1 - should be equal
    ok 2 - expect falsey value
    ok 3 - should be equivalent
    ok 4 - should be equal
ok 1 - it should not uncompress payloads using the zip algorithm # time=153.582ms

1..1
# time=164.31ms

Compression not working for static Assets

Compression not working for Static assets.

screenshot 2019-02-22 at 13 32 55

Is there a way to enable Fastify Compress to work with Fastify-Static?

Current code:

fastify.register(require("fastify-compress"));

fastify.register(require("fastify-static"), {
  root: path.join(__dirname, "assets"),
  prefix: "/assets" // optional: default '/'
});

Thank you.

Duplicated accept-encoding vary header in response

🐛 Bug Report

If the vary header has already been set as a string with multiple concatenated values, one of which is "accept-encoding", the plugin adds a second vary header with the "accept-encoding" value.

To Reproduce

Steps to reproduce the behavior:

1- register the plugin, then set the following into a route

reply.header('vary', 'accept-encoding,accept-version');

2- Check the response

Expected behavior

The value should not be added if already set inside the vary header.

NOTE: the plugin works correctly if the vary header is set as an array, ie. reply.header('vary', ['accept-encoding','accept-version']);, but I think the string version should also be handled.

Your Environment

  • node version: 14.15.1
  • fastify version: 3.12.0
  • os: Mac

Cannot read property debug of undefined

🐛 Bug Report

TypeError: Cannot read property 'debug' of undefined
    at Object.onSend (/workspace/node_modules/fastify-compress/index.js:130:21)
    at next (/workspace/node_modules/fastify/lib/hooks.js:185:34)
    at handleResolve (/workspace/node_modules/fastify/lib/hooks.js:192:5)
    at processTicksAndRejections (internal/process/task_queues.js:97:5)

To Reproduce

Try to run this in Google Cloud Functions:

app.register(fastifyCompress, options.compress)

Expected behavior


Your Environment

  • node version: 12
  • fastify version: >=3.0.0
  • os: Mac, Windows, Linux
  • any other relevant information

Allow tuning of Brotli compression level

🚀 Feature Proposal

Add ability to specify brotli compression level, preferably globally

Motivation

Brotli is great for compressing static assets, where you can amortize the cost of compression across multiple responses; however, in its default configuration it takes an extremely long time (multiple orders of magnitude vs. gzip) to compress data, making it a hard sell for dynamic content, particularly compared to the near-instant compression provided by gzip/deflate. Given that in e.g. NestJS the compression phase happens after responses are pulled from cache on the Nest side, even aggressively server-side cached responses can't amortize expensive compression operations across multiple requests.

Turning down the compression level on brotli may allow for acceptable compression times while still maintaining a compression ratio superior to gzip/deflate.

Example

When specifying brotli as a compression method, add a level parameter to the object. I'm guessing that the default is maximum compression (11?), and for BC reasons it'd be reasonable to keep that default if the level parameter wasn't provided.

Brotli not getting chosen

💬 Questions and Help

The readme says that compression is chosen in this order:
Brotli (br) -> gzip -> deflate

When I test it, my browser sends the following accept-encoding header:
gzip, deflate, br

I am curious as to why brotli isn't being selected? Is there something I am doing wrong?

My code just registers fastify-compress without anything special, version 2.0.0

An in-range update of fastify is breaking the build 🚨

The devDependency fastify was updated from 2.11.0 to 2.12.0.

🚨 View failing branch.

This version is covered by your current version range and after updating it in your project the build failed.

fastify is a devDependency of this project. It might not break your production code or affect downstream projects, but probably breaks your build or test tools, which may prevent deploying or publishing.

Status Details
  • continuous-integration/travis-ci/push: The Travis CI build failed (Details).

Release Notes for v2.12.0

📚 PR:

  • fix: skip serialization for json string (#1937)
  • Added fastify-casl to Community Plugins (#1977)
  • feature: added validation context to validation result (#1915)
  • ESM support (#1984)
  • fix: adjust hooks body null (#1991)
  • Added mcollina's plugin "fastify-secure-session" (#1999)
  • Add fastify-typeorm-plugin to community plugins (#1998)
  • Remove Azure Pipelines (#1985)
  • Update validation docs (#1994)
  • Drop Windows-latest and Node 6 from CI as its failing. (#2002)
  • doc: fix esm-support anchor (#2001)
  • Docs(Fluent-Schema.md): fix fluent schema repo link (#2007)
  • fix - docs - hooks - error handling (#2000)
  • add fastify-explorer to ecosystem (#2003)
  • Add a recommendations doc (#1997)
  • Fix TOC typo in recommendations (#2009)
  • docs(Typescript): typo (#2016)
  • docs: fix onResponse parameter (#2020)
  • Update template bug.md (#2025)
  • fix replace way enum (#2026)
  • docs: update inject features (#2029)
  • Update Copyright Year to 2020 (#2031)
  • add generic to typescript Reply.send payload (#2032)
  • Shorten longest line (docs) (#2035)
  • docs: OpenJS CoC (#2033)
  • Workaround for using one schema for multiple routes (#2044)
  • docs: inject chainable methods (#1917) (#2043)
  • http2: handle graceful close (#2050)
  • chore(package): update fluent-schema to version 0.10.0 (#2057)
  • chore(package): update yup to version 0.28.1 (#2059)
  • Update README.md (#2064)
  • Added fastify-response-validation to ecosystem (#2063)
  • fix: use opts of onRoute hook (#2060)
  • Fixed documentation typo (#2067)
  • Add missing TS definition for ServerOptions.genReqId function arg (#2076)
  • fix: throw hooks promise rejection (#2070) (#2074)
  • Add docs to stop processing hooks with async (#2079)
Commits

The new version differs by 38 commits.

  • 7a37924 Bumped v2.12.0
  • aacefcd Add docs to stop processing hooks with async (#2079)
  • c052c21 fix: throw hooks promise rejection (#2070) (#2074)
  • 6b39870 Add missing TS definition for ServerOptions.genReqId function arg (#2076)
  • 7fa4bdd Fixed documentation typo (#2067)
  • 6b73e0a fix: use opts of onRoute hook (#2060)
  • 7bb9733 Added fastify-response-validation to ecosystem (#2063)
  • 0a1c1f0 Update README.md (#2064)
  • bd9f608 chore(package): update yup to version 0.28.1 (#2059)
  • e19d078 chore(package): update fluent-schema to version 0.10.0 (#2057)
  • d0c976e http2: handle graceful close (#2050)
  • af8a6ac docs: inject chainable methods (#1917) (#2043)
  • 62f21b1 Workaround for using one schema for multiple routes (#2044)
  • 5258f42 docs: OpenJS CoC (#2033)
  • ac46905 Shorten longest line (docs) (#2035)

There are 38 commits in total.

See the full diff

FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

More than one wildcard directive in accept-encoding header causes fall back to `br` not `gzip`

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the bug has not already been reported

Fastify version

3.22.0

Plugin version

3.6.0

Node.js version

14.18.1

Operating system

Windows

Operating system version (i.e. 20.04, 11.3, 10)

10

Description

In the documentation it states that if the * wildcard directive is used in the Accept-Encoding request header then the plugin will use gzip. However, if multiple wildcard directives are included in the Accept-Encoding request header then it uses br.

I believe this is due to the following line only replacing the first occurrence of a *:

.replace('*', 'gzip') // consider the no-preference token as gzip for downstream compat

Suggested fix:
.replace(/\*/g, 'gzip')
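The difference is easy to verify in isolation: String.prototype.replace with a string pattern only touches the first occurrence, while the global regex replaces them all.

```javascript
const header = '*, *'

console.log(header.replace('*', 'gzip'))   // 'gzip, *'  (second wildcard survives)
console.log(header.replace(/\*/g, 'gzip')) // 'gzip, gzip'
```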

This also occurs in fastify-static:
https://github.com/fastify/fastify-static/blob/master/index.js#L436

Happy to open a PR for this.

Steps to Reproduce

  1. Stand up server:
const fastify = require("fastify")();
const { createReadStream } = require("fs");

// Import plugins
const compress = require("fastify-compress");

async function server() {
	fastify.register(compress, { inflateIfDeflated: true }).route({
		method: "GET",
		url: "/foo",
		handler: (req, res) => {
			res.type("text/plain").compress(createReadStream("./package.json"));
		},
	});

	await fastify.listen({ port: 8000 });
}

server();
  2. Make a request with more than one wildcard directive present:

Request:

> GET /foo HTTP/1.1
> Host: localhost:8000
> User-Agent: insomnia/2021.5.3
> accept-encoding: *, *
> Accept: */*

Response:

< HTTP/1.1 200 OK
< content-type: text/plain
< vary: accept-encoding
< content-encoding: br
< Date: Wed, 13 Oct 2021 10:12:46 GMT
< Connection: keep-alive
< Keep-Alive: timeout=5
< Transfer-Encoding: chunked
  3. Compare to a request with a single wildcard directive present:

Request:

> GET /foo HTTP/1.1
> Host: localhost:8000
> User-Agent: insomnia/2021.5.3
> accept-encoding: *
> Accept: */*

Response:

< HTTP/1.1 200 OK
< content-type: text/plain
< vary: accept-encoding
< content-encoding: gzip
< Date: Wed, 13 Oct 2021 10:20:21 GMT
< Connection: keep-alive
< Keep-Alive: timeout=5
< Transfer-Encoding: chunked

Expected Behavior

Content-Encoding value should be gzip.

An in-range update of iltorb is breaking the build 🚨

The devDependency iltorb was updated from 2.4.4 to 2.4.5.

🚨 View failing branch.

This version is covered by your current version range and after updating it in your project the build failed.

iltorb is a devDependency of this project. It might not break your production code or affect downstream projects, but probably breaks your build or test tools, which may prevent deploying or publishing.

Status Details
  • continuous-integration/travis-ci/push: The Travis CI build failed (Details).

Commits

The new version differs by 1 commits.

See the full diff


100% code coverage

As titled, this repo would benefit from having 100% code coverage (the --100 flag in tap).

An in-range update of @typescript-eslint/parser is breaking the build 🚨

The devDependency @typescript-eslint/parser was updated from 1.9.0 to 1.10.0.

🚨 View failing branch.

This version is covered by your current version range and after updating it in your project the build failed.

@typescript-eslint/parser is a devDependency of this project. It might not break your production code or affect downstream projects, but probably breaks your build or test tools, which may prevent deploying or publishing.

Status Details
  • continuous-integration/travis-ci/push: The Travis CI build could not complete due to an error (Details).


Bug: Quality value syntax not supported

It seems like fastify-compress does not support quality value syntax in the Accept-Encoding header. These values, for example, cause a 406 Unsupported Encoding HTTP error to be returned.

Looking into the source, at index.js:128 in onSend, the encoding variable is indeed null. This error goes away once the q directives are removed from the header value.

Example Accept-Encoding header value that we fail to parse:
gzip;q=1.0,deflate;q=0.6,identity;q=0.3

Quickly looking at the source, we will likely need to update getEncodingHeader to support this.
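A hypothetical sketch of q-value-aware parsing (not the plugin's code; the function name is made up):

```javascript
// Parse an accept-encoding header into tokens ordered by q-value,
// dropping tokens explicitly refused with q=0.
function parseAcceptEncoding (header) {
  return header
    .split(',')
    .map(part => {
      const [token, ...params] = part.trim().split(';')
      const qParam = params.find(p => p.trim().startsWith('q='))
      const q = qParam ? parseFloat(qParam.trim().slice(2)) : 1
      return { token: token.trim(), q }
    })
    .filter(({ q }) => q > 0)
    .sort((a, b) => b.q - a.q)
    .map(({ token }) => token)
}

console.log(parseAcceptEncoding('gzip;q=1.0,deflate;q=0.6,identity;q=0.3'))
// [ 'gzip', 'deflate', 'identity' ]
```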

Version 10 of node.js has been released

Version 10 of Node.js (code name Dubnium) has been released! 🎊

To see what happens to your code in Node.js 10, Greenkeeper has created a branch with the following changes:

  • Added the new Node.js version to your .travis.yml

If you’re interested in upgrading this repo to Node.js 10, you can open a PR with these changes. Please note that this issue is just intended as a friendly reminder and the PR as a possible starting point for getting your code running on Node.js 10.

More information on this issue

Greenkeeper has checked the engines key in any package.json file, the .nvmrc file, and the .travis.yml file, if present.

  • engines was only updated if it defined a single version, not a range.
  • .nvmrc was updated to Node.js 10
  • .travis.yml was only changed if there was a root-level node_js that didn’t already include Node.js 10, such as node or lts/*. In this case, the new version was appended to the list. We didn’t touch job or matrix configurations because these tend to be quite specific and complex, and it’s difficult to infer what the intentions were.

For many simpler .travis.yml configurations, this PR should suffice as-is, but depending on what you’re doing it may require additional work or may not be applicable at all. We’re also aware that you may have good reasons to not update to Node.js 10, which is why this was sent as an issue and not a pull request. Feel free to delete it without comment, I’m a humble robot and won’t feel rejected 🤖


FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

Native Brotli implementation

Hey 👋

As Brotli is natively available in Node 11, could we use that by default (if the Node version in use supports it)? 🤔

Should fastify-compress handle compressed incoming requests?

🚀 Feature Proposal

Fastify currently does not accept requests with Content-Encoding: gzip in its normal form, and since there is a generic module for compression-related work in the Fastify ecosystem, I think it would be good to extend its capabilities to support compressed incoming requests.

Motivation

Compression in HTTP is useful in HTTP requests too, not only in HTTP responses.

Example

An HTTP client would send a compressed HTTP request to a Fastify server using fastify-compress, and it would be handled properly.

Premature close

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the bug has not already been reported

Fastify version

3.27.4

Plugin version

4.0.1

Node.js version

16.13.1

Operating system

macOS

Operating system version (i.e. 20.04, 11.3, 10)

12.0.1

Description

Error [ERR_STREAM_PREMATURE_CLOSE]: Premature close
  at new NodeError (node:internal/errors:371:5)
  at Duplexify.onclose (node:internal/streams/end-of-stream:135:30)
  at Duplexify.emit (node:events:402:35)
  at Duplexify.emit (node:domain:475:12)
  at Duplexify._destroy (/Users/gajus/Documents/dev/contra/contra-web-app/node_modules/duplexify/index.js:199:8)
  at /Users/gajus/Documents/dev/contra/contra-web-app/node_modules/duplexify/index.js:182:10
  at processTicksAndRejections (node:internal/process/task_queues:78:11)

Steps to Reproduce

This appears to happen when a request is aborted while still serving the file.

Expected Behavior

Should produce a warning, but not an uncaught error.

An in-range update of @typescript-eslint/parser is breaking the build 🚨

The devDependency @typescript-eslint/parser was updated from 2.17.0 to 2.18.0.

🚨 View failing branch.

This version is covered by your current version range and after updating it in your project the build failed.

@typescript-eslint/parser is a devDependency of this project. It might not break your production code or affect downstream projects, but probably breaks your build or test tools, which may prevent deploying or publishing.

Status Details
  • continuous-integration/travis-ci/push: The Travis CI build failed (Details).

Release Notes for v2.18.0

2.18.0 (2020-01-27)

Bug Fixes

  • eslint-plugin: [explicit-module-boundary-types] false positive for returned fns (#1490) (5562ad5)
  • improve token types and add missing type guards (#1497) (ce41d7d)
  • eslint-plugin: [naming-convention] fix filter option (#1482) (718cd88)
  • eslint-plugin: fix property access on undefined error (#1507) (d89e8e8)
  • experimental-utils: widen type of settings property (#1527) (b515e47)
  • typescript-estree: error on unexpected jsdoc nodes (#1525) (c8dfac3)
  • typescript-estree: fix identifier tokens typed as Keyword (#1487) (77a1caa)

Features

  • eslint-plugin: add comma-spacing (#1495) (1fd86be)
  • eslint-plugin: add new rule prefer-as-const (#1431) (420db96)
  • eslint-plugin: create ban-ts-comment rule (#1361) (2a83d13)
  • eslint-plugin-internal: add prefer-ast-types-enum (#1508) (c3d0a3a)
  • experimental-utils: make RuleMetaData.docs optional (#1462) (cde97ac)
  • parser: improve scope-analysis types (#1481) (4a727fa)
Commits

The new version differs by 30 commits.

  • b835ec2 chore: publish v2.18.0
  • 367b18f docs(eslint-plugin): add script to generate the readme tables (#1524)
  • 03221d2 test: fix vscode launch configuration for windows (#1523)
  • f991764 chore(eslint-plugin): refactor explicit return type rules to share code (#1493)
  • c8dfac3 fix(typescript-estree): error on unexpected jsdoc nodes (#1525)
  • 6d1d2a2 test: fix coverage reports from codecov (#1528)
  • b515e47 fix(experimental-utils): widen type of settings property (#1527)
  • 67784d6 docs: extra 'a' in CONTRIBUTING.md (#1518)
  • afa7900 chore: enable prefer-ast-types-enum internal rule (#1514)
  • c3d0a3a feat(eslint-plugin-internal): add prefer-ast-types-enum (#1508)
  • 718cd88 fix(eslint-plugin): [naming-convention] fix filter option (#1482)
  • 802e347 chore(eslint-plugin): use getFixturesRootDir in tests (#1506)
  • 9ca65dc chore: update istanbul-reports to make tests quiet (#1509)
  • d89e8e8 fix(eslint-plugin): fix property access on undefined error (#1507)
  • 06731e7 test(eslint-plugin): cleanup no-use-before-define tests (#1505)

There are 30 commits in total.

See the full diff

FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

Fastify post requests

💬 Does fastify-compress support compression for POST requests?

PostGraphile with Fastify

I have been using PostGraphile with Fastify and am loving it.
However, there seems to be an issue with compression with POST requests sent using Postman or on the browser.

// short snippet of my code.
const fp = require("fastify-plugin");
const AutoLoad = require("fastify-autoload");
const connectToPostgreSQL = require("./db_connection/connectToPostgreSQL");
const registerEnv = require("./utils/register_env");
const compression = require("fastify-compress");
const helmet = require("fastify-helmet");

fastify
    .register(fp(registerEnv))
    .register(fp(connectToPostgreSQL))
    .register(compression, { global: true, threshold: 0 })
    .register(helmet)
    .ready(err => {
      if (err) return console.error(err);
      fastify.log.info("-----------------------------------------------------");
    });

May I know if there is anything that I have done wrongly?

An in-range update of @types/node is breaking the build 🚨

The devDependency @types/node was updated from 12.12.9 to 12.12.10.

🚨 View failing branch.

This version is covered by your current version range and after updating it in your project the build failed.

@types/node is a devDependency of this project. It might not break your production code or affect downstream projects, but probably breaks your build or test tools, which may prevent deploying or publishing.

Status Details
  • continuous-integration/travis-ci/push: The Travis CI build failed (Details).

FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

How to compress the output of an ejs template rendered reply without set the global flag

Hello
i'm doing something like that :
reply.view('/src/templates/hello.ejs', { hello: 'world' })
I'd like to compress the output of my ejs template, .compress will not work directly and I don't want to activate the global flag
Do you know how can I do that ? can we enrich the component with a compressOnSend method to accept to be chained and activate the onsend hook ?
reply.view('/src/templates/hello.ejs', { hello: 'world' }).compressOnSend()
Thanks

Conflict with aws-serverless-express

When using this module, my Lambda app stops working; it somehow causes an error and content isn't shown.

I first thought the issue related to fastify, but it doesn't; the bug is related to this plugin.

About the warning 'compress: missing payload'

In global mode, I'm getting "compress: missing payload" when I use reply.redirect.

Logging this warning directly in the onSend global hook when there's no payload seems a bit too much

if (payload == null) {
  reply.res.log.warn('compress: missing payload')
  return next()
}

fastify decompress using preParsing hook

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the bug has not already been reported

Fastify version

3.21.6

Plugin version

No response

Node.js version

16.2

Operating system

Windows

Operating system version (i.e. 20.04, 11.3, 10)

10

Description

I am using fastify-compress and it is working fine: it compresses a large string and returns it to the client with no issues.
My problem occurs when the client sends Content-Encoding: gzip and a Content-Length of 462 with a gzipped payload. The uncompressed length is 1325 bytes.

authorization: 'Basic xxxxxxxxxxxxxxxxxxxxxx==',
'accept-encoding': 'br, deflate, gzip',
'content-type': 'text/plain',
'content-encoding': 'gzip',
'content-length': '462',
host: 'localhost:3000',
connection: 'close'

I appear to have 2 problems.
When the preParsing hook fires, I have looked in the payload and do not see anything that remotely looks like the 462 bytes of gzipped content. I don't appear to see it anywhere.

server.addHook("preParsing", (request, reply, payload, done) => {
  done(null, payload);
});

Where in the payload field is the inbound data expected to be found?

When the preParsing hook exits, fastify reports this:
Request body size did not match Content-Length
"name": "FastifyError",
"code": "FST_ERR_CTP_INVALID_CONTENT_LENGTH",
"statusCode": 400

If I switch my client to point at our current Tomcat server it works fine.

Steps to Reproduce

Not easy as there is a lot of code involved here with many external dependencies.

Expected Behavior

Would expect to have uncompressed data on exit from the preParsing hook.

Do not compress event-stream

Hi,

With this module enabled, it also compresses the text/event-stream Content-Type, which is the default for SSE events (the fastify-sse module, for example).
The SSE output gets compressed and the browser no longer triggers SSE onmessage events.

I suggest the regexp
const compressibleTypes = /^text\/|\+json$|\+text$|\+xml$/
should be changed to not allow text/event-stream but allow text/

this happens even if the browser specifies that it accepts "accept-encoding:gzip, deflate, br"
(I think this is a browser issue, but Chrome and Firefox are pretty well established in ignoring it)

Update: in the end I added x-no-compression header to the request on the server side, but still I think it should be avoided by default by the module
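The suggested change is straightforward to express: keep matching text/* but carve out text/event-stream with a negative lookahead. A sketch of the proposed regexp, alongside the one quoted above (this is the issue's proposal, not the module's shipped pattern):

```javascript
// Compressible-types check as quoted in the issue (simplified):
const compressibleTypes = /^text\/|\+json$|\+text$|\+xml$/

// Proposed: still compress text/*, but never text/event-stream.
const proposedTypes = /^text\/(?!event-stream)|\+json$|\+text$|\+xml$/

console.log(proposedTypes.test('text/html'))           // true
console.log(proposedTypes.test('text/event-stream'))   // false
console.log(proposedTypes.test('application/ld+json')) // true
```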

Corrupted reply with async prefixed routes and br encoding

🐛 Bug Report

The JSON response becomes corrupted when using br compression together with prefixed async routes. This may be the unresolved cause of #104

To Reproduce

Steps to reproduce the behavior:

// package.json
{
  "name": "node_test",
  "version": "1.0.0",
  "main": "index.js",
  "scripts": {
    "start": "node index.js"
  },
  "dependencies": {
    "fastify": "^3.9.2",
    "fastify-compress": "^3.4.0"
  }
}
// index.js
const server = require('fastify')();

server.register(require('fastify-compress'), {
    threshold: 0,
    // the issue disappears by excluding 'br' from the encodings
    // encodings: ['gzip', 'deflate', 'identity']
});

server.register(require('./routes'));

server.listen(8080);
// routes/index.js
module.exports = async (fastify, _) => {
    fastify.register(require('./test'), {
        prefix: '/test', // if this prefix is removed, the issue disappears
    });
}
// routes/test.js
module.exports = async (fastify, opts) => {
    fastify.get('/', opts, async (req, _) => {
        return { hi: true };
    });
}

If a GET request is made to http://localhost:8080/test the response body is:

���{"hi":true}

Note the expected behavior is produced if the prefix in routes/index.js is removed, or if 'br' encoding is explicitly removed when registering fastify-compress

Expected behavior

{"hi":true}

Your Environment

  • node version: v14.15.1
  • fastify version: >=3.9.2
  • os: Windows 10
  • GET request was made with the chrome fetch API & postman and results were the same

brotli still in supportedEncodings even when brotli option is not set.

If the accept-encoding header includes 'br' the code tries to use the brotli compressStream even when the brotli option is not present since it's still in the supportedEncodings array.

Req:
accept-encoding: "br, gzip, deflate"

Res:

Message: compressStream[encoding] is not a function
Object.Object.onSend in /app/node_modules/fastify-compress/index.js:110
Object.next in /app/node_modules/fastify/lib/hookRunner.js:47
Object.onSendHookRunner in /app/node_modules/fastify/lib/hookRunner.js:61
Object.onSendHook in /app/node_modules/fastify/lib/reply.js:140
_Reply._Reply.Reply.send in /app/node_modules/fastify/lib/reply.js:80
result.result.then in /app/node_modules/fastify/lib/handleRequest.js:92
unknown.[anonymous]:null
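The fix amounts to intersecting the client's preferences with the encodings the server can actually produce, rather than consulting a static supportedEncodings list. A minimal sketch (the negotiateEncoding name is illustrative, not the plugin's API):

```javascript
// Pick the first client-requested encoding the server actually has enabled.
// With brotli not configured, 'br' in the header must not be selected.
function negotiateEncoding (acceptEncoding, enabled) {
  const requested = acceptEncoding.split(',').map(e => e.trim().toLowerCase())
  return requested.find(enc => enabled.includes(enc)) || null
}

// brotli is not configured, so 'br' is skipped and gzip wins:
console.log(negotiateEncoding('br, gzip, deflate', ['gzip', 'deflate']))
// gzip
```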

Conditionally Enable Content-Length Header

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the issue has not already been raised

Issue

I'm currently using Fastify and Fastify-Compress for a project.

I spent quite a while trying to figure out why downloads from my Fastify server report an unknown size. I noticed that after disabling compression, my Content-Length header is no longer stripped from the output.

Doing some research on previous issues, it appears there was one issue that was the core reason for its removal (fastify/fastify-static#52). As I require this feature in my project, it would be great if the 'Content-Length' header was not stripped from the response. If it's disabled by default, that is fine, as long as you can turn it on.

An in-range update of @types/node is breaking the build 🚨

The devDependency @types/node was updated from 12.6.9 to 12.7.0.

🚨 View failing branch.

This version is covered by your current version range and after updating it in your project the build failed.

@types/node is a devDependency of this project. It might not break your production code or affect downstream projects, but probably breaks your build or test tools, which may prevent deploying or publishing.

Status Details
  • continuous-integration/travis-ci/push: The Travis CI build could not complete due to an error (Details).

FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

An in-range update of fastify is breaking the build 🚨

The devDependency fastify was updated from 1.13.3 to 1.13.4.

🚨 View failing branch.

This version is covered by your current version range and after updating it in your project the build failed.

fastify is a devDependency of this project. It might not break your production code or affect downstream projects, but probably breaks your build or test tools, which may prevent deploying or publishing.

Status Details
  • continuous-integration/travis-ci/push: The Travis CI build failed (Details).

Release Notes for v1.13.4

Enhancements

  • move ECONNRESET log from error to debug - #1363

Fixes

  • Fix setting multiple cookies as multiple 'Set-Cookie' headers. - #1360
  • fix: #1353 ignore evaluation of $schema field in json-schema - #1354

Documentation

  • Update copyright year to 2019 - #1364
Commits

The new version differs by 6 commits.

See the full diff

FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

Add ability to append 'vary' response header rather than create a new one

🚀 Feature Proposal

At present, this plugin creates a new 'vary' header, even if one already exists, leading to the vary header becoming an array internally (vary: ["Origin", "accept-encoding"]) and presenting as multiple headers in a response:

vary: Origin
vary: accept-encoding

It would be good if the plugin could append to an existing vary header, i.e. vary: Origin, accept-encoding, similar to what the fastify-cors plugin does, preferably with an optional boolean option in the config.

Motivation

Whilst the above example is perfectly valid, some clients struggle with multiple headers with the same name (legacy medical systems basically).

Example

fastify.register(
  require('fastify-compress'),
  { appendVaryHeader: true }
)
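Appending instead of adding a second header is a small operation over the existing value. A sketch of the behavior the proposal asks for (the appendVary helper is hypothetical):

```javascript
// Append a value to an existing Vary header instead of creating a
// duplicate header, skipping values already present (case-insensitive).
function appendVary (existing, value) {
  if (!existing) return value
  const values = existing.split(',').map(v => v.trim().toLowerCase())
  if (values.includes(value.toLowerCase())) return existing
  return `${existing}, ${value}`
}

console.log(appendVary('Origin', 'accept-encoding'))  // Origin, accept-encoding
console.log(appendVary(undefined, 'accept-encoding')) // accept-encoding
```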

An in-range update of iltorb is breaking the build 🚨

The devDependency iltorb was updated from 2.4.2 to 2.4.3.

🚨 View failing branch.

This version is covered by your current version range and after updating it in your project the build failed.

iltorb is a devDependency of this project. It might not break your production code or affect downstream projects, but probably breaks your build or test tools, which may prevent deploying or publishing.

Status Details
  • continuous-integration/travis-ci/push: The Travis CI build failed (Details).

Commits

The new version differs by 7 commits.

See the full diff

FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

feat: move to `compress` and `decompress` route specific options, improved typescript types and tests, reorganize tests

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the feature has not already been requested

🚀 Feature Proposal

Hello,

following the PR #196, I would like to work on a few things for this package that will help improve development experience:

  • Add compress and decompress route shorthand configuration properties
  • Improve typescript types and actually test those types
  • Update the documentation
  • Reorganize and/or rewrite tests (renaming files following xxx.test.js convention, making groups with sub-tests, ...)

Note: some of those changes are breaking changes and will require a major bump.

If you think my proposal is acceptable, the first 3 points are covered by this PR #199.

Regards.

Motivation

  • It will follow the way other official packages define specific route options; it will be shorter ( 🤣 ) and more intuitive.
  • It will have complete typings and those types will actually be tested.

Example

Current way to configure specific route options:

'use strict'

const zlib = require('zlib')
const fastify = require('fastify')()

fastify.register(require('fastify-compress'), { global: false })

fastify.get('/one', {
  config: {
    compress: { createGzip: () => zlib.createGzip() },
    decompress: { createGunzip: () => zlib.createGunzip() }
  }
}, (request, reply) => {
 // ...
})

fastify.route({
  method: 'GET',
  path: '/two',
  config: {
    compress: { createGzip: () => zlib.createGzip() },
    decompress: { createGunzip: () => zlib.createGunzip() }
  },
  handler: (request, reply) => {
   // ...
  }
})

New way to configure specific route options:

'use strict'

const zlib = require('zlib')
const fastify = require('fastify')()

fastify.register(require('fastify-compress'), { global: false })

fastify.get('/one', {
  compress: { createGzip: () => zlib.createGzip() },
  decompress: { createGunzip: () => zlib.createGunzip() }
}, (request, reply) => {
 // ...
})

fastify.route({
  method: 'GET',
  path: '/two',
  compress: { createGzip: () => zlib.createGzip() },
  decompress: { createGunzip: () => zlib.createGunzip() },
  handler: (request, reply) => {
   // ...
  }
})

brotli compress

Hello, when I use brotli compression, the content encoding shown in the browser is gz

Remove unnecessary log

#37

I don't think there needs to be a log at all in the case where there isn't any data to compress. The plugin should just be a no-op in this case.

As it currently stands, I had no idea where this "compress: missing payload" log was coming from until I did a web search (https://duckduckgo.com/?q="compress%3A+missing+payload") and the only returned result was the original issue about this log message. So at a minimum this log needs to be clearer about where it comes from.

An in-range update of unzipper is breaking the build 🚨

The dependency unzipper was updated from 0.10.5 to 0.10.6.

🚨 View failing branch.

This version is covered by your current version range and after updating it in your project the build failed.

unzipper is a direct dependency of this project, and it is very likely causing it to break. If other packages depend on yours, this update is probably also breaking those in turn.

Status Details
  • continuous-integration/travis-ci/push: The Travis CI build failed (Details).

Commits

The new version differs by 1 commits.

  • e91734d HOTFIX: Fix pipecount (#169)

See the full diff

FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

Not compliant with RFC 7231

🐛 Bug Report

https://tools.ietf.org/html/rfc7231#section-5.3.4

  • "If an Accept-Encoding header field is present in a request
    and none of the available representations for the response have a
    content-coding that is listed as acceptable, the origin server SHOULD
    send a response without any content-coding."
  • "All content-coding values are case-insensitive"

To Reproduce

  • A request with the header Accept-Encoding: GzIP responds with HTTP 406.
  • A request with the header Accept-Encoding: Invalid responds with HTTP 406.

Expected behavior

  • A request with the header Accept-Encoding: GzIP should respond with gzip compressed payload.
  • A request with the header Accept-Encoding: Invalid should respond with uncompressed payload.

Your Environment

  • node version: v12.13.0
  • fastify version: 2.10.0
  • os: Mac
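Both RFC requirements reduce to two small changes in the negotiation step: lowercase the codings before comparing, and fall back to no content-coding instead of erroring. A sketch (the selectCoding name is illustrative, not the plugin's API):

```javascript
// RFC 7231-style negotiation: codings are case-insensitive, and an
// unacceptable Accept-Encoding yields an uncompressed response, not a 406.
function selectCoding (acceptEncoding, supported) {
  const requested = acceptEncoding.split(',').map(e => e.trim().toLowerCase())
  const match = requested.find(enc => supported.includes(enc))
  return match || 'identity' // SHOULD send without any content-coding
}

console.log(selectCoding('GzIP', ['gzip', 'deflate']))    // gzip
console.log(selectCoding('Invalid', ['gzip', 'deflate'])) // identity
```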
