
fastify-reply-from's Introduction



An efficient server implies lower infrastructure costs, better responsiveness under load, and happy users. How can you efficiently handle your server's resources, serving the highest possible number of requests, without sacrificing security validations and a handy development experience?

Enter Fastify. Fastify is a web framework highly focused on providing the best developer experience with the least overhead and a powerful plugin architecture. It is inspired by Hapi and Express and as far as we know, it is one of the fastest web frameworks in town.

The main branch refers to the Fastify v4 release. Check out the v3.x branch for v3.


Quick start

Create a folder and make it your current working directory:

mkdir my-app
cd my-app

Generate a fastify project with npm init:

npm init fastify

Install dependencies:

npm i

To start the app in dev mode:

npm run dev

For production mode:

npm start

Under the hood npm init downloads and runs Fastify Create, which in turn uses the generate functionality of Fastify CLI.

Install

To install Fastify in an existing project as a dependency:

Install with npm:

npm i fastify

Install with yarn:

yarn add fastify

Example

// Require the framework and instantiate it

// ESM
import Fastify from 'fastify'
const fastify = Fastify({
  logger: true
})
// CommonJs
const fastify = require('fastify')({
  logger: true
})

// Declare a route
fastify.get('/', (request, reply) => {
  reply.send({ hello: 'world' })
})

// Run the server!
fastify.listen({ port: 3000 }, (err, address) => {
  if (err) throw err
  // Server is now listening on ${address}
})

with async-await:

// ESM
import Fastify from 'fastify'
const fastify = Fastify({
  logger: true
})
// CommonJs
const fastify = require('fastify')({
  logger: true
})

fastify.get('/', async (request, reply) => {
  reply.type('application/json').code(200)
  return { hello: 'world' }
})

fastify.listen({ port: 3000 }, (err, address) => {
  if (err) throw err
  // Server is now listening on ${address}
})

Do you want to know more? Head to the Getting Started documentation.

Note

.listen binds to the localhost interface by default (127.0.0.1 or ::1, depending on the operating system configuration). If you are running Fastify in a container (Docker, GCP, etc.), you may need to bind to 0.0.0.0. Be careful when deciding to listen on all interfaces; it comes with inherent security risks. See the documentation for more information.
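A minimal sketch of making the bind address configurable, so the same code runs on loopback locally and on all interfaces inside a container. `FASTIFY_ADDRESS` is an assumed environment variable name used here for illustration, not a Fastify convention:

```javascript
// Resolve the bind address from the environment; default to loopback.
// FASTIFY_ADDRESS is a hypothetical variable name, not an official option.
function resolveHost (env) {
  return env.FASTIFY_ADDRESS || '127.0.0.1'
}

// Usage with Fastify (shown for context; requires the fastify package):
// fastify.listen({ port: 3000, host: resolveHost(process.env) }, (err) => {
//   if (err) throw err
// })

module.exports = { resolveHost }
```

Setting `FASTIFY_ADDRESS=0.0.0.0` in the container environment then opts in to listening on all interfaces explicitly, rather than making it the default everywhere.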

Core features

  • Highly performant: as far as we know, Fastify is one of the fastest web frameworks in town; depending on code complexity, it can serve up to 76+ thousand requests per second.
  • Extensible: Fastify is fully extensible via its hooks, plugins, and decorators.
  • Schema based: even if it is not mandatory, we recommend using JSON Schema to validate your routes and serialize your outputs; internally, Fastify compiles the schema into a highly performant function.
  • Logging: logs are extremely important but are costly; we chose the best logger to almost remove this cost, Pino!
  • Developer friendly: the framework is built to be very expressive and to help developers in their daily use, without sacrificing performance and security.
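As a small illustration of the schema-based feature above, declaring a response schema lets Fastify compile a fast serializer for the route. The route registration is commented out because it needs a running Fastify instance; `helloSchema` is an illustrative name:

```javascript
// Response schema for a route; Fastify compiles this into a fast serializer
// and drops any property not declared here from the output.
const helloSchema = {
  response: {
    200: {
      type: 'object',
      properties: {
        hello: { type: 'string' }
      }
    }
  }
}

// Usage (requires a Fastify instance):
// fastify.get('/', { schema: helloSchema }, async () => ({ hello: 'world' }))

module.exports = { helloSchema }
```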

Benchmarks

Machine: EX41S-SSD, Intel Core i7, 4 GHz, 64GB RAM, 4C/8T, SSD.

Method: autocannon -c 100 -d 40 -p 10 localhost:3000 * 2, taking the second average

Framework    Version  Requests/sec
Express      4.17.3   14,200
hapi         20.2.1   42,284
Restify      8.6.1    50,363
Koa          2.13.0   54,272
Fastify      4.0.0    77,193
-
http.Server  16.14.2  74,513

Benchmarks taken using https://github.com/fastify/benchmarks. This is a synthetic "hello world" benchmark that aims to evaluate the framework overhead. The overhead each framework adds depends on your application; you should always benchmark if performance matters to you.

Documentation

Chinese documentation (中文文档)

Ecosystem

  • Core - Core plugins maintained by the Fastify team.
  • Community - Community supported plugins.
  • Live Examples - Multirepo with a broad set of real working examples.
  • Discord - Join our discord server and chat with the maintainers.

Support

Please visit Fastify help to view prior support issues and to ask new support questions.

Contributing

Whether reporting bugs, discussing improvements and new ideas or writing code, we welcome contributions from anyone and everyone. Please read the CONTRIBUTING guidelines before submitting pull requests.

Team

Fastify is the result of the work of a great community. Team members are listed in alphabetical order.

Lead Maintainers:

Fastify Core team

Fastify Plugins team

Great Contributors

Great contributors on a specific area in the Fastify ecosystem will be invited to join this group by Lead Maintainers.

Past Collaborators

Hosted by

We are an At-Large Project in the OpenJS Foundation.

Sponsors

Support this project by becoming a SPONSOR! Fastify has an Open Collective page where we accept and manage financial contributions.

Acknowledgements

This project is kindly sponsored by:

Past Sponsors:

This list includes all companies that support one or more of the team members in the maintenance of this project.

License

Licensed under MIT.

For your convenience, here is a list of all the licenses of our production dependencies:

  • MIT
  • ISC
  • BSD-3-Clause
  • BSD-2-Clause

fastify-reply-from's People

Contributors

anyonecancode, cemremengu, climba03003, coreyfarrell, deanhaleem, delvedor, dependabot-preview[bot], dependabot[bot], dnlup, eomm, fdawgs, felixputera, github-actions[bot], greenkeeper[bot], ivan-tymoshenko, jkyberneees, jsumners, maxfrigge, mcollina, mikepresman, nileshmali, psteinroe, rafaelgss, rluvaton, salmanm, simenb, skywickenden, uzlopak, vincent178, yohayg



fastify-reply-from's Issues

The "string" argument must be of type string or an instance of Buffer or ArrayBuffer. Received [Object: null prototype]

🐛 Bug Report

When using HTTP with a request Content-Type of application/x-www-form-urlencoded, Buffer.byteLength(body) throws: the "string" argument must be of type string or an instance of Buffer or ArrayBuffer. Received [Object: null prototype]

To Reproduce

Steps to reproduce the behavior:

curl -X POST "http://127.0.0.1:3000/changeSignStatus" -H "accept:*/*" -H "Content-Type:application/x-www-form-urlencoded; charset=UTF-8" -d "key=1152&status=1"

Your Environment

  • node version: v14.4.0
  • fastify version: ^3.8.0
  • os: Mac
  • any other relevant information

Crash: Source crashes when http2 target goes down or becomes unavailable

Scenario 1:
Start the source server without starting the client. The source process crashes immediately as it tries to create the client connection eagerly.

Scenario 2:

  1. Start target and then source
  2. Shutdown target service
  3. Hit proxied request
    The source server crashes because the http2 client session becomes invalid after the target shuts down.

Proposed Solution:

  1. Lazily initialize http2Client on the first request.
  2. Set up an error event listener on http2Client.
    Planning to return status 500 with Internal Server Error as the message. Any suggestions?
  3. Check the http2Client.destroyed flag to determine whether the session is valid, and attempt to reconnect.

Pull request will follow shortly.
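The lazy-initialization part of this proposal can be sketched generically. This is an illustrative pattern only, not the plugin's actual code; `connect` stands in for `http2.connect`:

```javascript
// Create the client on first use and recreate it once it has been destroyed,
// so a dead http2 session never crashes the process on the next request.
function lazyClient (connect) {
  let client = null
  return function getClient () {
    if (client === null || client.destroyed) {
      client = connect()
      // In the real fix an 'error' listener would be attached here so a dead
      // session surfaces as a 500 response instead of an uncaught exception.
    }
    return client
  }
}

module.exports = { lazyClient }
```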

request.from(...) to a fastify service with multipart/form-data leads to ECONNRESET

🐛 Bug Report

I configured two web services, both running on the top of the fastify ecosystem:

  • my-reverse-proxy: uses fastify-multipart because some endpoints (not reported in the example) have to use it. If incoming requests respect some logic, these are forwarded to...

  • my-end-service: handles a http multipart/form-data call carrying a file and some metadata

To Reproduce

I prepared this repo and you can follow the instruction provided in the README.md file

Steps to reproduce the behavior:
See this repo

Expected behavior

The request is forwarded by fastify-reply-from to the service which has to handle the call.
This service can handle all the parts of the multipart/form-data correctly.

Your Environment

- node: erbium
- fastify: v3.8.0
- fastify-reply-from: v3.4.0
- fastify-multipart: v3.3.1
- docker-compose version 1.27.4, build 40524192

An in-range update of got is breaking the build 🚨

The devDependency got was updated from 9.2.1 to 9.2.2.

🚨 View failing branch.

This version is covered by your current version range and after updating it in your project the build failed.

got is a devDependency of this project. It might not break your production code or affect downstream projects, but probably breaks your build or test tools, which may prevent deploying or publishing.

Status Details
  • continuous-integration/travis-ci/push: The Travis CI build failed (Details).

Release Notes for v9.2.2
  • Gracefully handle invalid Location redirect URLs. (#605) 7ae6939
  • Don't override hooks when merging arguments. 3ad3950
  • Merge hooks on got.extend(). (#608) 292f78a

v9.2.1...v9.2.2

Commits

The new version differs by 4 commits.

  • 248d68c 9.2.2
  • 3ad3950 Don't override hooks when merging arguments
  • 292f78a Merge hooks on got.extend() (#608)
  • 7ae6939 Gracefully handle invalid Location redirect URLs (#605)

See the full diff

FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

http2 support for target server

I would like to use this as proxy for http2 target fastify server. But right now it seems that it does not support target as http2.
If you could provide guidance I will provide pull request for http2 support and request id tracking.
Please let me know.

Using `rewriteRequestHeaders` with async/await does not work properly

🐛 Bug Report

It seems that when using an async callback for rewriteRequestHeaders, an empty object gets returned and the API overrides the headers. I originally found the bug while using https://github.com/fastify/fastify-http-proxy

To Reproduce

Steps to reproduce the behavior:

const geoip = require("fast-geoip");
const Fastify = require("fastify");

const target = Fastify({
  logger: true,
});

target.get("/", (request, reply) => {
  console.log(request.headers);
  reply.send("hello world");
});

const proxy = Fastify({
  logger: true,
});

proxy.register(require("fastify-reply-from"), {
  base: "http://localhost:3001/",
  rewriteRequestHeaders: async (request, headers) => {
    const { country } = await geoip.lookup("192.11.111.111");
    return { ...headers, country, time: Date.now() };
  },
});

Expected behavior

Expecting to receive the headers from the upstream along with country and time headers

Your Environment

  • node version: 14
  • fastify version: latest
  • os: Mac

Documentation mismatch: reply.from options.body

🐛 Bug Report

Hello!
There is a documentation mismatch for reply.from options.body field.

body

Replaces the original request body with what is specified. Unless
[contentType][contentType] is specified, the content will be passed
through JSON.stringify().
Setting this option will not verify if the http method allows for a body.

But, if body is defined for a GET request, an error will be thrown:

{
    "statusCode": 500,
    "error": "Internal Server Error",
    "message": "Rewriting the body when doing a GET is not allowed"
}

I've read fastify/fastify#953 and I understand that Fastify will not process the body for GET requests (which is RFC compliant).

My question: the docs should be updated OR if reply.from options.body is defined, to bypass body verification?

We are using this amazing HTTP stack (fastify, fastify-reply-from, undici) for a lot of projects and it's great.
But for some legacy projects, where we have GET requests with a body, we encountered some issues.

Thank you!

To Reproduce

Steps to reproduce the behavior:

const Fastify = require('fastify');

const upstream = Fastify({logger: true});

upstream.all('*', (request, reply) => {
  reply.send(`hello world: ${request.body}`)
});

upstream.listen(9090);


const server = Fastify({logger: true});
server.register(require('fastify-reply-from'), {base: 'http://localhost:9090'});

server.all('*', (request, reply) => {
  reply.from(request.url, {body: 'custom body'});
});

server.listen(8080);
curl --request GET 'http://localhost:8080/'

Expected behavior

hello world: custom body

Your Environment

  • node version: 14.16.0
  • fastify version: 3.13.0
  • fastify-reply-from version: 5.0.1
  • os: Mac

New undici pool is created on each request

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the bug has not already been reported

Fastify version

3.22.1

Plugin version

6.4.0

Node.js version

16.11

Operating system

macOS

Operating system version (i.e. 20.04, 11.3, 10)

11.3

Description

The proxy will create a new connection pool on each request, which produces an unlimited number of connections to a single host and high resource usage (and a performance decrease).

https://github.com/fastify/fastify-reply-from/blob/master/lib/request.js#L133

Steps to Reproduce

create proxy:

const app = require('fastify')({
  logger: true
});

app.register(require('fastify-reply-from'), {
  base: 'http://localhost:3010/',
  undici: {
    connections: 10,
  },
})

app.get('/alive', (request, reply) => {
  reply.from('/alive')
})

app.listen(3000, (err) => {
  if (err) {
    throw err
  }
})

create target

const fastify = require('fastify')({
  logger: true,
  keepAliveTimeout: 30000,
})

fastify.get('/alive', (request, reply) => {
  reply.send({ hello: 'world' })
})

// Run the server!
fastify.listen(3010, (err, address) => {
  if (err) throw err
  // Server is now listening on ${address}
})

run autocannon -r 20 http://localhost:3000/alive -f

then check number of file descriptors (open sockets) both on target and proxy
lsof -a -p <PID> | wc -l

Expected Behavior

Since the number of connections is limited to 10 per origin (host), a limited number of file descriptors (open sockets) was expected, but the number keeps growing into the thousands.

fastify-reply-from should use the built-in request function implemented in undici, which uses an Agent singleton with a single pool per host.

P.S. With plugin 5.x it works as expected.
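The fix described above amounts to memoizing one pool per origin instead of constructing a new pool per request. A minimal, dependency-free sketch, where `createPool` stands in for undici's Pool constructor (illustrative only, not the plugin's actual code):

```javascript
// Cache one connection pool per origin so repeated requests to the same
// upstream reuse sockets instead of opening new ones each time.
function poolCache (createPool) {
  const pools = new Map()
  return function getPool (origin) {
    if (!pools.has(origin)) {
      pools.set(origin, createPool(origin))
    }
    return pools.get(origin)
  }
}

module.exports = { poolCache }
```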

`buildURL` doesn't handle `base` with port 80

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the bug has not already been reported

Fastify version

3.21.4

Plugin version

6.0.1

Node.js version

14.x

Operating system

macOS

Operating system version (i.e. 20.04, 11.3, 10)

11.6

Description

If base contains :80, the buildURL function throws.

Steps to Reproduce

const { buildURL } = require('fastify-reply-from/lib/utils')

buildURL('/graphql', 'http://localhost:80')

This throws - removing the :80 works.

(new URL strips out the port for http URLs (and 443 for https))

Expected Behavior

It should gracefully handle the default port being provided
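The behavior behind this bug can be demonstrated with Node's stdlib alone: the WHATWG URL parser drops default ports, so a base configured with :80 never equals the href it produces. A tolerant comparison can normalize both sides through URL first (`sameOrigin` is a hypothetical helper, not the plugin's API):

```javascript
// WHATWG URL strips default ports, so these two spellings are equivalent.
const httpBase = new URL('http://localhost:80')   // href: http://localhost/
const httpsBase = new URL('https://localhost:443') // href: https://localhost/

// Compare origins after normalization instead of raw strings.
function sameOrigin (a, b) {
  return new URL(a).origin === new URL(b).origin
}

module.exports = { sameOrigin, httpBase, httpsBase }
```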

Support for request tracking

In ToDo list we have:

forward the request id to the other peer might require some refactoring because we have to make the req.id unique

Currently, to achieve this I am overriding the logger's genReqId and using hyperid to generate unique ids, then setting the x-request-id header with this value. This allows me to propagate the same request id down to microservices.

My suggestion would be to do this in Fastify by setting x-request-id (or a custom request header) to a hyperid-generated value. Currently, the Fastify request id is generated using a simple integer counter.

Please suggest if you have another plan in mind, let me know. I would be happy to submit pull request.

An in-range update of simple-get is breaking the build 🚨

Version 3.0.3 of simple-get was just published.

Branch Build failing 🚨
Dependency simple-get
Current Version 3.0.2
Type devDependency

This version is covered by your current version range and after updating it in your project the build failed.

simple-get is a devDependency of this project. It might not break your production code or affect downstream projects, but probably breaks your build or test tools, which may prevent deploying or publishing.

Status Details
  • continuous-integration/travis-ci/push: The Travis CI build could not complete due to an error (Details).

Commits

The new version differs by 6 commits.

  • 7173ab4 3.0.3
  • bda9500 Support { form: {}, json: true } use case
  • f5a38d4 Merge pull request #40 from feross/add-gitignore-update-travis
  • f72f88d restricted .gitignore to more relevant things
  • 813fd13 Make supported node version explicit in travis
  • 22a5f69 Added .gitignore

See the full diff

FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

Use a custom undici instance

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the feature has not already been requested

🚀 Feature Proposal

👋🏻 Thank you for this awesome package! I was thinking that it could be useful to allow passing a custom undici instance, in addition to or as an alternative to the custom options already available. WDYT?

Motivation

The first 2 reasons that come to mind are:

  • use a mockedClient in tests
  • re-use an instance already initialised in the app

Example

No response

Suggestion: rewriteHeaders should take originalReq as argument

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the feature has not already been requested

🚀 Feature Proposal

Currently, rewriteHeaders - which rewrites response headers - only takes the headers object as an argument. Meanwhile, rewriteRequestHeaders - which rewrites request headers - takes an additional originalReq parameter which includes the request object.

I suggest that both methods should have the same signature.

Motivation

In a lot of use cases, we want to perform some kind of manipulation of response headers, where request context is relevant. For example, we may want to modify the location header in a redirect based on some external header passed in the request. This is not possible if the request object is not available.

Also, it just looks nicer if two symmetric methods have the same signature.

Example

No response
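A sketch of what the proposed symmetric signature could look like. The parameter order and the Location rewrite are illustrative assumptions for this feature request, not the plugin's actual API:

```javascript
// Proposed shape: rewriteHeaders also receives the original request,
// mirroring rewriteRequestHeaders, so response rewrites can use request
// context (here: fixing a redirect Location from a forwarded-host header).
function rewriteHeaders (headers, originalReq) {
  const out = { ...headers }
  if (out.location && originalReq.headers['x-forwarded-host']) {
    out.location = out.location.replace(
      /^https?:\/\/[^\/]+/,
      'https://' + originalReq.headers['x-forwarded-host']
    )
  }
  return out
}

module.exports = { rewriteHeaders }
```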

Reply.from inside an async handler causes "Promise may not be fulfilled with 'undefined' when statusCode is not 204"

🐛 Bug Report

Using reply.from inside an async request handler causes the error "Promise may not be fulfilled with 'undefined' when statusCode is not 204".

To Reproduce

Steps to reproduce the behavior:
The error surfaces from /node_modules/fastify/lib/wrapThenable.js:30:30. The handler code:

server.get('/', async (req, reply) => {
  // ...some code here
  reply.from('<other url>');
});

Expected behavior

no error message

Your Environment

  • node version: 10
  • fastify version: 3.6.0
  • os: Mac
  • any other relevant information

NOT-AN-ISSUE: Upcoming fork

Hi fastify-reply-from team, just wanted to share this upcoming derivative/fork project:
https://github.com/jkyberneees/req-proxy

Although I properly credit this project as the base of the fast-proxy fork, I also kept Matteo Collina in the license header. Can I kindly ask for your feedback here?

Best Regards

An in-range update of got is breaking the build 🚨

Version 9.2.0 of got was just published.

Branch Build failing 🚨
Dependency got
Current Version 9.1.0
Type devDependency

This version is covered by your current version range and after updating it in your project the build failed.

got is a devDependency of this project. It might not break your production code or affect downstream projects, but probably breaks your build or test tools, which may prevent deploying or publishing.

Status Details
  • continuous-integration/travis-ci/push: The Travis CI build failed (Details).

Release Notes v9.2.0

v9.1.0...v9.2.0

Commits

The new version differs by 19 commits.

  • aec95d2 9.2.0
  • f8af5b0 Update http-timer dependency to 1.1.0
  • e66a6b6 Fix Electron throwing HTTP trailers are not supported error (#598)
  • eedebc9 Add cookieJar option (#596)
  • ab0d24b Add "Bugs" to the comparsion table
  • 887f02d Improve code readability
  • a8eb41b Proper fix for #469 (#594)
  • 78a56ec Provide timings (#590)
  • bb8175b Correct the comparison table
  • 7910e14 Unify calling mergeOptions
  • 488ac7e Remove redundant code
  • 21bef3c Update readme.md (#593)
  • 267cb66 Document the response object (#592)
  • d0757da Add tests for stripping port in host header (#591)
  • dda1ce9 Use correct package/module wording in the readme

There are 19 commits in total.

See the full diff

FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

Invalid content-type check

Hi team, I wanted to report that the following statement:

body = contentType.toLowerCase() === 'application/json' ? JSON.stringify(this.request.body) : this.request.body

Will fail in case of extended content-type formats like: application/json;charset=utf-8

Check should be:

body = contentType.toLowerCase().indexOf('application/json') === 0 ? JSON.stringify(this.request.body) : this.request.body

Can we fix this one?
Regards
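A runnable version of the suggested fix, checking the media-type prefix so parameters such as charset are ignored. `maybeStringifyBody` is a hypothetical helper name; the plugin's actual code differs:

```javascript
// Treat any content-type whose media type is application/json as JSON,
// regardless of parameters like ";charset=utf-8". startsWith(...) is
// equivalent to the indexOf(...) === 0 check proposed in the issue.
function maybeStringifyBody (contentType, body) {
  const isJson = contentType.toLowerCase().startsWith('application/json')
  return isJson ? JSON.stringify(body) : body
}

module.exports = { maybeStringifyBody }
```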

Add an option to provide custom http/s agent

🚀 Feature Proposal

In the http options there is only a way to provide http agent options, not a custom agent instance.

Motivation

We need the ability to provide a custom http/s agent for example: agentkeepalive

Example

You can also pass a custom http agents. If you pass the agents, then the http.agentOptions will be ignored. To illustrate:

proxy.register(require('fastify-reply-from'), {
  base: 'http://localhost:3001/',
  http: {
    agents: {
      // pass in any options from https://nodejs.org/api/http.html#http_new_agent_options
      'http:': new http.Agent({ keepAliveMsecs: 10 * 60 * 1000 }),
      'https:': new https.Agent({ keepAliveMsecs: 10 * 60 * 1000 })
    },
    requestOptions: { // pass in any options from https://nodejs.org/api/http.html#http_http_request_options_callback
      timeout: 5000 // timeout in msecs, defaults to 10000 (10 seconds)
    }
  }
})

Support using undici behind a proxy server

🚀 Feature Proposal

Add a field in undici options that specifies the proxy you need to connect to before forwarding the request to the upstream.

Motivation

Right now the only way to use the plugin behind a proxy server is to not use undici and instead use http while passing the agents option:

const ReplyFrom = require('fastify-reply-from');
const proxy = require('proxy-agent');
fastify.register(ReplyFrom, {
  http: {
     agents: {
       "http:": proxy('url'),
       "https:": proxy('url')
     }
  }
});

Recently undici has added support for proxies as mentioned in its documentation, so it would be great if we can support passing the proxy field as part of undici's options and not have to switch to http.

Example

const ReplyFrom = require('fastify-reply-from');
fastify.register(ReplyFrom, {
  undici: {
    proxy: 'http://username:password@proxyIp:proxyPort'
  }
});

Rewrite request headers

Would you be open to a pull request for a new option to rewrite the request headers similar to your rewriteHeaders option? I am actually using the fastify-http-proxy module and I need to set the Host header to the original value for the upstream to accept the request.

More context: the upstream is Minio and it seems to base the signature on some header values, i.e. the Host, e.g. https://github.com/minio/cookbook/blob/master/docs/setup-nginx-proxy-with-minio.md#proxy-all-requests

Reply from fails when a query string is present in the source URL

🐛 Bug Report

URLs with a query string in the source fail to be proxied. Specifically, this is being called from fastify-http-proxy, but I believe the bug is in fastify-reply-from.

To Reproduce

Steps to reproduce the behavior:

  1. Register fastify-reply-from:

fastify.register(From, {
  upstream: 'http://localhost:8000/test',
  rewritePrefix: '/test',
  base: 'http://localhost:8000/test',
});

  2. In a route handler:

reply.from('/test?some_query=1')

  3. The buildURL util throws an error, I believe erroneously. In this case dest.href is http://localhost:8000/test?some_query=1 and it does not start with http://localhost:8000/test/ (a trailing slash was appended even though it's a non-relative URL).

Without knowing a ton about how the plugin is supposed to function, it seems like checking for the presence of a query string on dest before trying to append a trailing slash would fix the issue.
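The suggested check can be sketched as follows; `prefixMatches` is a hypothetical stand-in for the plugin's buildURL, shown only to illustrate the query-string guard:

```javascript
// Only require the trailing slash when the destination has no query string:
// with a query string, dest.href can never end in the slash the original
// check expects, so compare against the bare base instead.
function prefixMatches (base, destHref) {
  const dest = new URL(destHref, base)
  const prefix = dest.search ? base : base + '/'
  return dest.href.startsWith(prefix)
}

module.exports = { prefixMatches }
```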

Expected behavior

I would expect this not to throw an error, and the request to be proxied to the url with the query string maintained.

Your Environment

  • node version: 12.14.0
  • fastify version: 3.12.0
  • os: Mac
  • other info fastify-reply-from is being called by fastify-http-proxy 5.0.0 in this case.

Support load balancing across multiple base urls.

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the feature has not already been requested

🚀 Feature Proposal

Undici now ships BalancedPool, a utility to load balance across multiple upstreams.
We should support this in fastify-reply-from.

Motivation

No response

Example

No response

Allow disabling request logging

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the feature has not already been requested

🚀 Feature Proposal

For each request the following is logged:

| 2022-02-17 22:16:41 | response received
| 2022-02-17 22:16:41 | fetching from remote server

We would like to disable these. Maybe it could even be linked to the fastify disableRequestLogging: true option.

Motivation

For every request we get two extra log lines. That adds up quickly; also, if we keep them, we would like them to be more descriptive.

Example

import replyFrom from 'fastify-reply-from'
...
app.register(replyFrom, {
      base: OUR_ENDPOINT,
      disableRequestLogging: true,
})

`rewriteRequestHeaders` unable to access request decorations

🐛 Bug Report

rewriteRequestHeaders unable to access request decorations

To Reproduce

Steps to reproduce the behavior:

fastify.decorateRequest('foo', null)
fastify.addHook('preHandler', async (request, reply) => {
  request.foo = 'Bar'
})
fastify.register(require('fastify-http-proxy'), {
  upstream: process.env.UPSTREAM,
  prefix: '/api',
  replyOptions: {
    rewriteRequestHeaders(originalReq, headers) {
      const modifiedHeaders = {
        ...headers,
        'x-foo': originalReq.foo
      }
      return modifiedHeaders
    }
  }
})

Expected behavior

I expect that the request decorations would be accessible via rewriteRequestHeaders via the originalReq parameter.

Your Environment

  • node version: 12
  • fastify version: 2.13.1
  • os: Mac

Content-Type is set to application/octet-stream for requests with no body

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure it has not already been reported

Fastify version

3.15.1

Node.js version

14.5.0

Operating system

macOS

Operating system version (i.e. 20.04, 11.3, 10)

10.15.7

Description

If the target server replies with status code 204 and an empty response body, a content-type: application/octet-stream header is added by Fastify to the proxied response.
res.stream is passed to .send(), and since it has a .pipe() method, Fastify erroneously sets the above-mentioned header.

If the target server is pinged directly, the response doesn't have a content-type header at all, which is correct.

Steps to Reproduce

const Fastify = require('fastify')

const target = Fastify({
    logger: true
})

target.get('/', (request, reply) => {
    // added status 204 here
    reply.status(204).send();
})

const proxy = Fastify({
    logger: true
})

proxy.register(require('fastify-reply-from'), {
    base: 'http://localhost:3001/'
})

proxy.get('/', (request, reply) => {
    reply.from('/')
})

target.listen(3001, (err) => {
    if (err) {
        throw err
    }

    proxy.listen(3000, (err) => {
        if (err) {
            throw err
        }
    })
})

Expected Behavior

Content-Type header should not be set for responses with status code 204 since there's no actual content sent.

ignore queryString options when url has search property already

Hi there,

is there any reason to return url.search directly here? https://github.com/fastify/fastify-reply-from/blob/master/index.js#L117-L124

My use case is that I want to append an access_token to the url in the proxy, but it got ignored when original url has queries, could you please explain a bit here?

proxy.all('/*', (request, reply) => {
  reply.from(request.req.url, {
    queryString: {
      access_token: request.cookies.access_token
    }
  })
})

Even though I specify queryString here, it doesn't work because of the code above.

Thanks,
Vincent

An in-range update of fastify is breaking the build 🚨

The devDependency fastify was updated from 1.11.2 to 1.12.0.

🚨 View failing branch.

This version is covered by your current version range and after updating it in your project the build failed.

fastify is a devDependency of this project. It might not break your production code or affect downstream projects, but probably breaks your build or test tools, which may prevent deploying or publishing.

Status Details
  • continuous-integration/travis-ci/push: The Travis CI build failed (Details).

Release Notes for v1.12.0

Enhancements

  • Fix missing object wrapper for errors in pino log - #1180
  • Complete support for Joi. Added test - #1178 #1179

Fixes

  • fix the logic of convert chunk segement into string - #1172

Typescript

  • Add overloads for { parseAs: "string" } or { parseAs: "buffer" } - #1162
  • added generic Query, Params, Headers, and Body types - #1160

Internals

  • Update genReqId test name - #1167

Documentation

  • Keep consistent es6 usage in readme example - #1181
  • added fastify-jwt-webapp - #1141
  • Update Ecosystem.md - #1170
  • Add @cemremengu to contributors - #1176
  • Correct typo in docs/Plugins-Guide.md - #1173
  • Add fastify-webpack-hmr community plugin to ECOSYSTEM.md - #1152
  • Updated README and tap runner - #1171
  • Error Handling Documentation - #1130
  • Update ContentTypeParser.md - #1157
  • Add fastify-vue-plugin to Ecosystem.md - #1151
  • Add fastify-loader to Ecosystem.md - #1154
  • Add badge of vulnerabilities from snyk.io - #1149
Commits

The new version differs by 20 commits.

  • dfec122 Bumped v1.12.0
  • 889f16d Fix missing object wrapper for errors in pino log (#1180)
  • f33b0b2 Keep consistent es6 usage in readme example (#1181)
  • 01687a8 added fastify-jwt-webapp (#1141)
  • cf7e76c Complete support for Joi. Added test (#1179)
  • 0afa858 Fix support for latest Joi (#1178)
  • c21932c Update Ecosystem.md (#1170)
  • 12b62d9 Add @cemremengu to contributors (#1176)
  • 5441280 fix the logic of convert chunk segement into string (#1172)
  • 64869a5 Correct typo in docs/Plugins-Guide.md (#1173)
  • 80f74c8 Updated README and tap runner (#1171)
  • 1b16a4c chore(package): update autocannon to version 3.0.0 (#1165)
  • 4b5f818 Error Handling Documentation (#1130)
  • 15fbe24 added generic Query, Params, Headers, and Body types (#1160)
  • c21bb9f Add overloads for { parseAs: "string" } or { parseAs: "buffer" } (#1162)

There are 20 commits in total.

See the full diff

FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

follow-redirects usage via Agents doesn't work with fastify-reply-from

🐛 Bug Report

It seems that follow-redirects doesn't work with fastify-reply-from when supplied via the new Agent() mechanism.

To Reproduce

const Fastify = require('fastify').default
const { http, https } = require('follow-redirects')

const proxy = Fastify({
  logger: true,
})

proxy.register(require('fastify-reply-from').default, {
  http: {
    agents: { 'http:': new http.Agent(), 'https:': new https.Agent() },
  },
})

Using this setup the actual request-related code of follow-redirects is never called.

However I could get it to work using the following patch:

diff --git a/lib/request.js b/lib/request.js
index a202e4e9d0f6b2da852e33563cdb931eba5e6a76..81145c5206ee66c17cf5607384aa84cf2c3b70b5 100644
--- a/lib/request.js
+++ b/lib/request.js
@@ -1,7 +1,7 @@
 'use strict'
 const semver = require('semver')
-const http = require('http')
-const https = require('https')
+const http = require('follow-redirects/http')
+const https = require('follow-redirects/https')
 const querystring = require('querystring')
 const eos = require('end-of-stream')
 const pump = require('pump')

Your Environment

  • node version: 14.9.0
  • fastify version: 3.9.2
  • fastify-reply-from: 4.0.0
  • os: Mac & Linux

Proxying requests to multiple servers

Hi, I would like to proxy some requests to one host and some to another. It seems to be impossible with the current design. I can write code to support this feature, but I want to discuss the possible design first.

Add a hook to set the upstream base url based on the request data

🚀 Feature Proposal

Add a hook to set the upstream base url based on the request data

Motivation

Needed for a gradual rollout of services.

For example:
instance: http://localhost:3000/test
service A: http://localhost:3001/test
service B: http://localhost:3002/test

I'd like 20% of the requests to go to http://localhost:3001/test and 80% to go to http://localhost:3002/test

In this case getUpstream would make a random 20%/80% choice between the two

see PR: #157
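The idea could be sketched roughly as follows. This is illustrative code only: `getUpstream` is the option name from the PR, while the weighting helper and its injectable random source are hypothetical.

```javascript
// Pick an upstream according to weights. `rand` defaults to Math.random
// but is injectable so the choice can be tested deterministically.
function pickUpstream (upstreams, rand = Math.random) {
  const total = upstreams.reduce((sum, u) => sum + u.weight, 0)
  let r = rand() * total
  for (const u of upstreams) {
    r -= u.weight
    if (r < 0) return u.url
  }
  return upstreams[upstreams.length - 1].url
}

const upstreams = [
  { url: 'http://localhost:3001', weight: 20 }, // service A: 20% of traffic
  { url: 'http://localhost:3002', weight: 80 }  // service B: 80% of traffic
]

// Hypothetical registration using the proposed hook:
// proxy.register(require('fastify-reply-from'), {
//   getUpstream: (req) => pickUpstream(upstreams)
// })
```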

Handle 503 and Retry-After header

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the feature has not already been requested

🚀 Feature Proposal

After receiving a 503, we might want to retry sending that request after a short delay.

This should be done for GET requests.

Motivation

No response

Example

No response
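Although no example was given, the retry delay could be derived from the Retry-After header along these lines (an illustrative sketch, not the plugin's code; the header may hold either a delay in seconds or an HTTP date):

```javascript
// Convert a Retry-After header value into a delay in milliseconds.
// Returns null when the header is absent or unparseable.
function retryAfterToMs (value, now = Date.now()) {
  if (value == null) return null
  // Form 1: delay in seconds, e.g. "120"
  if (/^\d+$/.test(value)) return Number(value) * 1000
  // Form 2: HTTP date, e.g. "Wed, 21 Oct 2015 07:28:00 GMT"
  const date = Date.parse(value)
  if (Number.isNaN(date)) return null
  return Math.max(0, date - now)
}
```

A retrying GET proxy would then wait `retryAfterToMs(res.headers['retry-after'])` milliseconds (or some default) before re-issuing the request.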

Request type in onResponse option

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the bug has not already been reported

Fastify version

3.24.1

Plugin version

6.4.1

Node.js version

16.x

Operating system

Linux

Operating system version (i.e. 20.04, 11.3, 10)

20.04

Description

The request type in index.d.ts (https://github.com/fastify/fastify-reply-from/blob/master/index.d.ts#L36) is FastifyRequest, but it doesn't match the argument actually passed in the code, which is the raw request:

onResponse(this.request.raw, this, res.stream)

Steps to Reproduce

I'm trying to pass replyOptions for https://github.com/fastify/fastify-http-proxy

const options: proxy.FastifyHttpProxyOptions = {
  upstream,
  prefix,
  rewritePrefix,
  config: route,
  proxyPayloads: false,
  preHandler: (req, reply) => { Object.assign(req.raw, { startedAt: hrtime.bigint() }) },
  undici: {
    headersTimeout: this.headersTimeout,
  },
  replyOptions: {
    onResponse: (req, reply, res) => {
      // req is typed as FastifyRequest, but the real type is IncomingMessage
      console.log(req.raw) // undefined
      console.log('time to first byte:', hrtime.bigint() - req.raw.startedAt) // cannot read startedAt of undefined
      reply.send(res)
    },
  },
}

Expected Behavior

The first argument of onResponse should have the correct type: either IncomingMessage in index.d.ts, or onResponse(this.request, this, res.stream) in the code.

Introducing request timeout

🚀 Feature Proposal

Right now, there's no request timeout option in fastify-reply-from. Although there's a workaround when using undici via the undici.timeout option, nothing equivalent is available for http and http2 requests.

I will create a PR for this feature, if this is accepted.

Motivation

Having a timeout on requests is essential in many cases.

Example

proxy.register(require('fastify-reply-from'), {
  base: 'http://localhost:3001/',
  requestTimeout: 30 * 1000
})

An in-range update of fastify is breaking the build 🚨

Version 1.11.2 of fastify was just published.

Branch Build failing 🚨
Dependency fastify
Current Version 1.11.1
Type devDependency

This version is covered by your current version range and after updating it in your project the build failed.

fastify is a devDependency of this project. It might not break your production code or affect downstream projects, but probably breaks your build or test tools, which may prevent deploying or publishing.

Status Details
  • continuous-integration/travis-ci/push: The Travis CI build failed (Details).

Release Notes v1.11.2

Internals

  • Handle promises in the error handler with the same logic of normal handlers - #1134
  • Rename ContentTypeParser - #1123
  • after should not cause inject() to be called - #1132

Documentation

  • Add trivikr@ to the collaborators list - #1139
  • Updated ecosystem doc - #1137
Commits

The new version differs by 13 commits.

  • 4e047a8 Bumped v1.11.2
  • c40ea62 Add trivikr@ to the collaborators list (#1139)
  • 0a27c92 Correct typos in Github Issue Template (#1140)
  • 5b18645 Updated ecosystem doc (#1137)
  • 0a874b9 Handle promises in the error handler with the same logic of normal handlers (#1134)
  • cce1a85 Rename ContentTypeParser (#1123)
  • 6d302a5 Add test for error fixed in mcollina/avvio#74 (#1132)
  • 60b85e7 Update Validation-and-Serialization.md (#1124)
  • d6982ea Remove/Merge redundant decorate functions (#1120)
  • baeebef Updated standard to v12. (#1121)
  • 7c8401d Update ContentTypeParser.js (#1122)
  • a14397d ecosystem in alphabetical order
  • 8a0c618 Update Ecosystem.md (#1125)

See the full diff

FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

Always sending requests `application/json` if not a stream

} else {
  body = JSON.stringify(this.request.body)
  headers['content-length'] = Buffer.byteLength(body)
  headers['content-type'] = 'application/json'
}

If a POST request is sent with a content type of text/plain and a body of `this is plain text`, then the above branch is hit: the outgoing request's content-type header is set to application/json and the body is JSON stringified.

  1. The body should not be JSON stringified unless the received payload is designated application/json
  2. The received content-type header should be passed along as-is instead of being rewritten to application/json
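A possible fix could follow this sketch (a hypothetical helper, not the plugin's actual code), branching on the incoming content type instead of always forcing JSON:

```javascript
// Decide how to serialize the outgoing proxy body based on the incoming
// content-type, instead of unconditionally producing application/json.
function serializeBody (body, contentType) {
  if (contentType && contentType.startsWith('application/json')) {
    return { body: JSON.stringify(body), contentType: 'application/json' }
  }
  // Pass non-JSON payloads through untouched, preserving the header as-is.
  return { body, contentType }
}
```

With this shape, a text/plain body stays a plain string and keeps its original content-type header on the upstream request.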

An in-range update of nock is breaking the build 🚨

Version 9.6.0 of nock was just published.

Branch Build failing 🚨
Dependency nock
Current Version 9.5.0
Type devDependency

This version is covered by your current version range and after updating it in your project the build failed.

nock is a devDependency of this project. It might not break your production code or affect downstream projects, but probably breaks your build or test tools, which may prevent deploying or publishing.

Status Details
  • continuous-integration/travis-ci/push: The Travis CI build could not complete due to an error (Details).

Release Notes v9.6.0

9.6.0 (2018-08-08)

Features

  • Allow optionally() to be called with a value, specifying if the mock should be optional (d8a2606)
Commits

The new version differs by 2 commits.

  • d666949 Merge pull request #1177 from timrogers/master
  • d8a2606 feat: Allow optionally() to be called with a value, specifying if the mock should be optional

See the full diff

FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

Action required: Greenkeeper could not be activated 🚨

🚨 You need to enable Continuous Integration on all branches of this repository. 🚨

To enable Greenkeeper, you need to make sure that a commit status is reported on all branches. This is required by Greenkeeper because it uses your CI build statuses to figure out when to notify you about breaking changes.

Since we didn’t receive a CI status on the greenkeeper/initial branch, it’s possible that you don’t have CI set up yet. We recommend using Travis CI, but Greenkeeper will work with every other CI service as well.

If you have already set up a CI for this repository, you might need to check how it’s configured. Make sure it is set to run on all new branches. If you don’t want it to run on absolutely every branch, you can whitelist branches starting with greenkeeper/.

Once you have installed and configured CI on this repository correctly, you’ll need to re-trigger Greenkeeper’s initial pull request. To do this, please delete the greenkeeper/initial branch in this repository, and then remove and re-add this repository to the Greenkeeper App’s white list on Github. You'll find this list on your repo or organization’s settings page, under Installed GitHub Apps.

content-length has to be deleted for replyOptions.onResponse

🐛 Bug Report

When using reply.send(replacementResult) from replyOptions.onResponse, the content-length header is prepopulated. This prevents sending a longer result (I'm not sure what would happen with a shorter result).

To Reproduce

Steps to reproduce the behavior:

import {createReadStream} from 'fs';
import fastify from 'fastify';
import fastifyHttpProxy from 'fastify-http-proxy';

async function main() {
  const daemon1 = fastify();
  daemon1.get('/test', (_, reply) => {
    reply.header('x-accel-redirect', '/redirect-path');
    reply.send('');
  });
  await daemon1.listen();

  const upstream = `http://localhost:${daemon1.server.address().port}`;
  const daemon2 = fastify({logger: false});
  daemon2.register(fastifyHttpProxy, {
    upstream,
    prefix: '/',
    replyOptions: {
      onResponse: (request, reply, res) => {
        const destination = res.headers['x-accel-redirect'] ?? '';
        if (destination) {
          // XXX need to uncomment the following or it will fail
          // reply.removeHeader('content-length');
          return reply.send(createReadStream('package.json'));
        }

        reply.send(res);
      }
    }
  });

  daemon2.listen(8080);
}


main().catch(error => {
  console.error(error);
  process.exit(1);
});

Start this script, then run curl -v http://localhost:8080/. With reply.removeHeader('content-length') commented out, you will receive a 0-byte response.

Expected behavior

Removing the content-length header before sending the replacement reply produces the expected result (./package.json is transmitted in the response). I'm not sure if this should be automatic, or if this need should be documented here or on fastify-reply-from. It took me quite a while to troubleshoot; when I used reply.send('redir') from daemon1 I saw only the first 5 characters of the response, which is when I realized the problem.

Your Environment

  • node version: 14
  • fastify version: 3.4.1
  • fastify-http-proxy: 4.0.4
  • fastify-reply-from: 3.3.0
  • os: Linux

Host header should use the host property instead of hostname

🐛 Bug Report

The host header should be <host>:<port> according to MDN.
Right now the host header is obtained with new URL(source).hostname, which is missing the port (see here).
I think it should be obtained with new URL(source).host instead.

To Reproduce

rewriteRequestHeaders(originalReq, headers){
    console.log(headers.host)
}

reply.from('http://127.0.0.1:8080' , {
    rewriteRequestHeaders
});

Expected behavior

The header should be 127.0.0.1:8080 but instead it is 127.0.0.1
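The difference is easy to see with the WHATWG URL class directly:

```javascript
const source = new URL('http://127.0.0.1:8080')

console.log(source.hostname) // '127.0.0.1'      — what the plugin uses today
console.log(source.host)     // '127.0.0.1:8080' — what the host header should carry
```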

Your Environment

  • node version: 14.15.5
  • os: Mac

An in-range update of fastify is breaking the build 🚨

Version 1.11.1 of fastify was just published.

Branch Build failing 🚨
Dependency fastify
Current Version 1.11.0
Type devDependency

This version is covered by your current version range and after updating it in your project the build failed.

fastify is a devDependency of this project. It might not break your production code or affect downstream projects, but probably breaks your build or test tools, which may prevent deploying or publishing.

Status Details
  • continuous-integration/travis-ci/push: The Travis CI build failed (Details).

Release Notes v1.11.1
Commits

The new version differs by 10 commits.

  • e8ae197 Bumped v1.11.1
  • 45fb2f4 Revert "Log error after customErrorHandler (#1073)" (#1119)
  • 288d9ec Added eslint-import-resolver-node (#1118)
  • cef8814 Fix decorate{Request, Reply} not recognizing getter/setter config (#1114)
  • d99cd61 chore(package): update snazzy to version 8.0.0 (#1112)
  • f1007bb Add test for trust proxy with ip addresses (#1111)
  • da89735 Add test for trust proxy with number (#1110)
  • 4bcc1f6 Refactor trust-proxy tests (#1103)
  • ebee8d4 Augment types available for https server options (#1109)
  • 4bffcf9 Update Ecosystem.md (#1106)

See the full diff

FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

An in-range update of undici is breaking the build 🚨

Version 0.3.2 of undici was just published.

Branch Build failing 🚨
Dependency undici
Current Version 0.3.1
Type dependency

This version is covered by your current version range and after updating it in your project the build failed.

undici is a direct dependency of this project, and it is very likely causing it to break. If other packages depend on yours, this update is probably also breaking those in turn.

Status Details
  • continuous-integration/travis-ci/push: The Travis CI build could not complete due to an error (Details).

Commits

The new version differs by 2 commits.

See the full diff

FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

Add onError hook

🚀 Feature Proposal
Like the existing onResponse hook, please add an onError hook

Motivation

An onError hook is needed in case you would like to override the default behavior,
reply.code(code).send(error)
for example to add logging or report to monitoring tools.

Example

Adding an onError hook with logging and telemetry

reply.from(`http://localhost:${target.server.address().port}/`,
      {
        onError: (reply, code, error) => {
          my_alerting_system.counter('error', { code })
          logger.error(`request failed with error: ${error}`)
          reply.code(code).send(error)
        }
      })
