This project is forked from animir/node-rate-limiter-flexible

Node.js rate limiting of requests by key with atomic increments in process Memory, Cluster or PM2, Redis, MongoDB, etc.

License: ISC

node-rate-limiter-flexible

rate-limiter-flexible counts and limits the number of actions by key and protects against DDoS and brute force attacks at any scale.

It works with Redis, process Memory, Cluster or PM2, Memcached, MongoDB, MySQL and PostgreSQL, and allows you to control the request rate in a single process or a distributed environment.

Atomic increments. All operations, in memory or in a distributed environment, use atomic increments to prevent race conditions.

Traffic bursts. Replace a Token Bucket with BurstyRateLimiter to allow traffic bursts (see the sketch after this feature list).

Fast. An average request takes 0.7 ms in Cluster mode and 2.5 ms in a distributed application. See benchmarks.

Flexible. Combine limiters, block a key for some duration, delay actions, manage failover with insurance options, configure smart key blocking in memory, and more.

Ready for growth. It provides a unified API for all limiters, so it is ready whenever your application grows. Prepare your limiters in minutes.

Friendly. No matter which node package you prefer: redis or ioredis, sequelize or knex, memcached, native driver or mongoose. It works with all of them.

In-memory blocks. Avoid extra requests to the store with inmemoryBlockOnConsumed.

It uses a fixed window, as it is much faster than a rolling window. See comparative benchmarks with other libraries here
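
A minimal sketch of the bursty setup mentioned above, assuming BurstyRateLimiter wraps a sustained limiter plus a separate burst limiter; the points and duration values below are illustrative only:

const { RateLimiterMemory, BurstyRateLimiter } = require('rate-limiter-flexible');

// Sustained rate: 2 points per second (illustrative values)
const sustainedLimiter = new RateLimiterMemory({ points: 2, duration: 1 });

// Burst allowance: 5 extra points per 10 seconds (illustrative values)
const burstLimiter = new RateLimiterMemory({ keyPrefix: 'burst', points: 5, duration: 10 });

const burstyLimiter = new BurstyRateLimiter(sustainedLimiter, burstLimiter);

const remoteAddress = '127.0.0.1'; // example key

burstyLimiter.consume(remoteAddress)
    .then((rateLimiterRes) => {
      // Allowed: the sustained limiter or the burst limiter still had points
    })
    .catch((rateLimiterRes) => {
      // Both limiters are exhausted
    });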

Installation

npm i --save rate-limiter-flexible

yarn add rate-limiter-flexible

Basic Example

const { RateLimiterMemory } = require('rate-limiter-flexible');

const opts = {
  points: 6, // 6 points
  duration: 1, // Per second
};

const rateLimiter = new RateLimiterMemory(opts);

rateLimiter.consume(remoteAddress, 2) // consume 2 points
    .then((rateLimiterRes) => {
      // 2 points consumed
    })
    .catch((rateLimiterRes) => {
      // Not enough points to consume
    });

RateLimiterRes object

Both the Promise resolve and reject return an object of the RateLimiterRes class, unless an error occurs. Object attributes:

RateLimiterRes = {
    msBeforeNext: 250, // Number of milliseconds before next action can be done
    remainingPoints: 0, // Number of remaining points in current duration 
    consumedPoints: 5, // Number of consumed points in current duration 
    isFirstInDuration: false, // action is first in current duration 
}
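
With store limiters the rejected value may also be a plain Error, for example when the store is unreachable and no insurance limiter is configured, so a catch handler can tell the two cases apart. A minimal sketch, continuing the basic example above:

rateLimiter.consume(remoteAddress)
    .then((rateLimiterRes) => {
      // Allowed; rateLimiterRes describes the current state
    })
    .catch((rejRes) => {
      if (rejRes instanceof Error) {
        // Some store error happened, e.g. the store is unreachable
      } else {
        // Not enough points; rejRes is a RateLimiterRes object
      }
    });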

You may want to set the following HTTP headers on the response:

const headers = {
  "Retry-After": rateLimiterRes.msBeforeNext / 1000,
  "X-RateLimit-Limit": opts.points,
  "X-RateLimit-Remaining": rateLimiterRes.remainingPoints,
  "X-RateLimit-Reset": new Date(Date.now() + rateLimiterRes.msBeforeNext)
}
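
For example, a sketch of an Express middleware (Express itself is an assumption here, not something the library requires) that sets these headers and responds with 429 when a key runs out of points:

const express = require('express');
const { RateLimiterMemory } = require('rate-limiter-flexible');

const app = express();
const rateLimiter = new RateLimiterMemory({ points: 6, duration: 1 });

app.use((req, res, next) => {
  rateLimiter.consume(req.ip)
    .then((rateLimiterRes) => {
      res.set('X-RateLimit-Limit', String(6));
      res.set('X-RateLimit-Remaining', String(rateLimiterRes.remainingPoints));
      next();
    })
    .catch((rateLimiterRes) => {
      res.set('Retry-After', String(Math.ceil(rateLimiterRes.msBeforeNext / 1000)));
      res.status(429).send('Too Many Requests');
    });
});

app.get('/', (req, res) => res.send('OK'));
app.listen(3000);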

Advantages:

Middlewares and plugins

Some copy/paste examples on Wiki:

Migration from other packages

  • express-brute Bonus: race conditions fixed, prod deps removed
  • limiter Bonus: multi-server support, respects queue order, native promises

Docs and Examples

Changelog

See releases for detailed changelog.

Basic Options

  • points

    Default: 4

Maximum number of points that can be consumed over duration.

  • duration

    Default: 1

    Number of seconds before consumed points are reset.

Points are never reset if duration is set to 0.

  • storeClient

    Required for store limiters

Has to be redis, ioredis, memcached, mongodb, pg, mysql2, mysql or any other related pool or connection (see the sketch below this list).
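
A minimal sketch of the storeClient option, assuming the ioredis package; any of the listed clients is passed the same way:

const Redis = require('ioredis');
const { RateLimiterRedis } = require('rate-limiter-flexible');

// enableOfflineQueue: false makes requests fail fast if Redis is down,
// instead of queueing commands in the client
const redisClient = new Redis({ enableOfflineQueue: false });

const rateLimiter = new RateLimiterRedis({
  storeClient: redisClient,
  keyPrefix: 'middleware', // namespace for keys in the store
  points: 10,              // 10 points
  duration: 1,             // per second
});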

Other options on Wiki:

Smooth out traffic peaks:

Specific:

API

Read detailed description on Wiki.

Benchmark

Average latency during a test of a pure Node.js endpoint in a cluster of 4 workers, with everything set up on one server.

1000 concurrent clients with a maximum of 2000 requests per second for 30 seconds.

1. Memory     0.34 ms
2. Cluster    0.69 ms
3. Redis      2.45 ms
4. Memcached  3.89 ms
5. Mongo      4.75 ms

500 concurrent clients with a maximum of 1000 requests per second for 30 seconds.

6. PostgreSQL 7.48 ms (with connection pool max 100)
7. MySQL     14.59 ms (with connection pool 100)

Note that you can speed up limiters with the inmemoryBlockOnConsumed option.
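
A sketch of that option on a Redis limiter: once a key has consumed the configured number of points, further checks for that key are answered from process memory instead of hitting the store. The inmemoryBlockDuration option used here is assumed to be the companion setting that controls how long the in-memory block lasts.

const Redis = require('ioredis');
const { RateLimiterRedis } = require('rate-limiter-flexible');

const redisClient = new Redis({ enableOfflineQueue: false });

const rateLimiter = new RateLimiterRedis({
  storeClient: redisClient,
  points: 100,                     // 100 points
  duration: 60,                    // per 60 seconds
  inmemoryBlockOnConsumed: 100,    // block the key in memory once 100 points are consumed
  inmemoryBlockDuration: 60,       // keep the in-memory block for 60 seconds
});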

Contribution

Appreciated, feel free!

Make sure you've run npm run eslint before creating a PR; all errors have to be fixed.

You can try to run npm run eslint-fix to fix some issues.

Any new limiter with a store has to extend RateLimiterStoreAbstract. It has to implement at least 4 methods:

  • _getRateLimiterRes parses raw data from store to RateLimiterRes object.
  • _upsert inserts or updates limits data by key and returns raw data.
  • _get returns raw data by key.
  • _delete deletes all key related data and returns true on deleted, false if key is not found.

All other methods depend on the store. See RateLimiterRedis or RateLimiterPostgres for examples.
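
A structural sketch only: the storage calls are placeholders, the class name RateLimiterMyStore is hypothetical, the parameter names are illustrative (exact signatures depend on the store), and requiring RateLimiterStoreAbstract and RateLimiterRes from the package's lib/ folder is an assumption about the package layout. It just shows where the four methods above fit:

// Assumed internal paths; adjust to however the abstract classes are exposed in your setup
const RateLimiterStoreAbstract = require('rate-limiter-flexible/lib/RateLimiterStoreAbstract');
const RateLimiterRes = require('rate-limiter-flexible/lib/RateLimiterRes');

class RateLimiterMyStore extends RateLimiterStoreAbstract {
  _getRateLimiterRes(rlKey, changedPoints, storeResult) {
    // Parse raw store data into a RateLimiterRes object; field names depend on your store schema
    const res = new RateLimiterRes();
    res.consumedPoints = storeResult.points;
    res.isFirstInDuration = storeResult.points === changedPoints;
    res.remainingPoints = Math.max(this.points - res.consumedPoints, 0);
    res.msBeforeNext = Math.max(new Date(storeResult.expire).getTime() - Date.now(), 0);
    return res;
  }

  _upsert(rlKey, points, msDuration, forceExpire = false) {
    // Insert or update limits data by key and resolve with raw store data (placeholder)
  }

  _get(rlKey) {
    // Resolve with raw data by key, or null if the key is not found (placeholder)
  }

  _delete(rlKey) {
    // Delete all key-related data; resolve true if deleted, false if the key is not found (placeholder)
  }
}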

Note: all changes should be covered by tests.
