
sidekiq-rate-limiter's Introduction

sidekiq-rate-limiter


Redis-backed, per-worker rate limits for job processing.

Compatibility

sidekiq-rate-limiter is actively tested against MRI versions 2.7 and 3.1.

sidekiq-rate-limiter works by using a custom fetch class, the class responsible for pulling work from the queue stored in Redis. Consequently, you'll want to be careful about using other gems that use the same strategy, sidekiq-priority being one example.

I've attempted to support the same options as used by sidekiq-throttler. So, if your worker already looks like this example I lifted from the sidekiq-throttler wiki:

class MyWorker
  include Sidekiq::Worker

  sidekiq_options throttle: { threshold: 50, period: 1.hour }

  def perform(user_id)
    # Do some heavy API interactions.
  end
end

Then you wouldn't need to change anything.

Installation

Add this line to your application's Gemfile:

gem 'sidekiq-rate-limiter'

And then execute:

$ bundle

Or install it yourself as:

$ gem install sidekiq-rate-limiter

Configuration

See server.rb for an example of how to configure sidekiq-rate-limiter. Alternatively, you can add the following to your initializer or what-have-you:

require 'sidekiq-rate-limiter/server'

Or, if you prefer, amend your Gemfile like so:

gem 'sidekiq-rate-limiter', require: 'sidekiq-rate-limiter/server'
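
Under the hood, requiring sidekiq-rate-limiter/server swaps Sidekiq's fetch strategy for the gem's rate-limiting fetch class. Roughly, that amounts to something like the sketch below; the exact wiring here is an approximation, and server.rb in the gem is the canonical version:

require 'sidekiq-rate-limiter'

Sidekiq.configure_server do |config|
  # Use the rate-limiting fetch class when pulling work from Redis.
  # Sketch only; see server.rb for the authoritative setup.
  Sidekiq.options[:fetch] = Sidekiq::RateLimiter::Fetch
end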

By default, the limiter uses the name sidekiq-rate-limiter. You can change this by defining the constant Sidekiq::RateLimiter::DEFAULT_LIMIT_NAME prior to requiring the gem. Alternatively, you can include a name parameter in the configuration hash passed to sidekiq_options.
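
For the constant approach, a minimal sketch (the name shown is just an example, and the constant must be defined before the gem is required):

module Sidekiq
  module RateLimiter
    DEFAULT_LIMIT_NAME = 'my_app_rate_limit'.freeze # example name; define before the require below
  end
end

require 'sidekiq-rate-limiter/server'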

For example, the following worker sets its own limit name (along with a limit and period) in sidekiq_options:

  class Job
    include Sidekiq::Worker

    sidekiq_options queue: 'some_silly_queue',
                    rate: {
                      name:   'my_super_awesome_rate_limit',
                      limit:  50,
                      period: 3600, ## An hour
                    }

    def perform(*args)
      ## do stuff
      ## ...
    end
  end

The configuration above would result in any jobs beyond the first 50 in a one-hour period being delayed. The server will continue to fetch items from Redis, and will place any items beyond the threshold at the back of their queue.

Dynamic Configuration

The simplest way to set the rate-limiting options (:name, :limit, and :period) is to assign them each a static value (as above). In some cases, you may wish to calculate values for these options for each specific job. You can do this by supplying a Proc for any or all of these options.

The Proc may receive as its arguments the same values that will be passed to perform when the job is finally performed.

class Job
  include Sidekiq::Worker

  sidekiq_options queue: "my_queue",
                  rate: {
                    name:   ->(user_id, rate_limit) { user_id },
                    limit:  ->(user_id, rate_limit) { rate_limit },
                    period: ->{ Date.today.monday? ? 2.hours : 4.hours }, # can ignore arguments
                  }

  def perform(user_id, rate_limit)
    ## do something
  end
end

Caveat: Normally, Sidekiq stores the sidekiq_options with the job on your Redis server at the time the job is enqueued, and it is these stored values that are used for rate-limiting. This means that if you deploy a new version of your code with different sidekiq_options, the already-queued jobs will continue to behave according to the options that were in place when they were created. When you supply a Proc for one or more of your configuration options, your rate-limiting options can no longer be stored in Redis, but must instead be calculated when the job is fetched by your Sidekiq server for potential execution. If your application code changes while a job is in the queue, it may run with different sidekiq_options than existed when it was first enqueued.
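
For instance, enqueuing the worker above with per-user values might look like this (the arguments are illustrative):

# The first argument becomes the limit name (one bucket per user),
# and the second becomes that user's limit for the period.
Job.perform_async(42, 100)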

Motivation

Sidekiq::Throttler is great for smaller quantities of jobs but falls down a bit for larger queues (see issue #8). In addition, jobs that are limited multiple times are counted as 'processed' each time, so the stats balloon quickly.

TODO

  • While it subclasses instead of monkey patching, setting Sidekiq.options[:fetch] is still asking for interaction issues. It would be better for this to be directly in Sidekiq or to use some other means to accomplish this goal.

Contributing

  1. Fork
  2. Commit
  3. Pull Request

License

MIT. See LICENSE for details.


sidekiq-rate-limiter's Issues

Errors when using with redis-rb > 4.1.0

When I install the latest versions of everything with bundle install, and then run the specs, I see these errors:

..F.F..

Failures:

  1) Sidekiq::RateLimiter::Fetch should retrieve work
     Failure/Error: lim.add(klass)
     
     Redis::CommandError:
       ERR value is not an integer or out of range
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/redis-4.1.1/lib/redis/pipeline.rb:156:in `value'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/redis-4.1.1/lib/redis/pipeline.rb:148:in `_set'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/redis-4.1.1/lib/redis/pipeline.rb:75:in `block in finish'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/redis-4.1.1/lib/redis/pipeline.rb:74:in `each'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/redis-4.1.1/lib/redis/pipeline.rb:74:in `each_with_index'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/redis-4.1.1/lib/redis/pipeline.rb:74:in `each'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/redis-4.1.1/lib/redis/pipeline.rb:74:in `map'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/redis-4.1.1/lib/redis/pipeline.rb:74:in `finish'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/redis-4.1.1/lib/redis/pipeline.rb:98:in `finish'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/redis-4.1.1/lib/redis/client.rb:164:in `block in call_pipeline'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/redis-4.1.1/lib/redis/client.rb:306:in `with_reconnect'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/redis-4.1.1/lib/redis/client.rb:162:in `call_pipeline'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/redis-4.1.1/lib/redis.rb:2462:in `block in multi'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/redis-4.1.1/lib/redis.rb:52:in `block in synchronize'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/redis-4.1.1/lib/redis.rb:52:in `synchronize'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/redis-4.1.1/lib/redis.rb:2454:in `multi'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/redis_rate_limiter-0.1.0/lib/redis_rate_limiter.rb:35:in `add'
     # ./lib/sidekiq-rate-limiter/fetch.rb:39:in `block in limit'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/sidekiq-5.2.7/lib/sidekiq.rb:97:in `block in redis'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/connection_pool-2.2.2/lib/connection_pool.rb:65:in `block (2 levels) in with'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/connection_pool-2.2.2/lib/connection_pool.rb:64:in `handle_interrupt'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/connection_pool-2.2.2/lib/connection_pool.rb:64:in `block in with'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/connection_pool-2.2.2/lib/connection_pool.rb:61:in `handle_interrupt'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/connection_pool-2.2.2/lib/connection_pool.rb:61:in `with'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/sidekiq-5.2.7/lib/sidekiq.rb:94:in `redis'
     # ./lib/sidekiq-rate-limiter/fetch.rb:33:in `limit'
     # ./lib/sidekiq-rate-limiter/fetch.rb:11:in `retrieve_work'
     # ./spec/sidekiq-rate-limiter/fetch_spec.rb:57:in `block (2 levels) in <top (required)>'

  2) Sidekiq::RateLimiter::Fetch should accept procs for limit, name, and period config keys
     Failure/Error: lim.add(klass)
     
     Redis::CommandError:
       ERR value is not an integer or out of range
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/redis-4.1.1/lib/redis/pipeline.rb:156:in `value'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/redis-4.1.1/lib/redis/pipeline.rb:148:in `_set'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/redis-4.1.1/lib/redis/pipeline.rb:75:in `block in finish'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/redis-4.1.1/lib/redis/pipeline.rb:74:in `each'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/redis-4.1.1/lib/redis/pipeline.rb:74:in `each_with_index'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/redis-4.1.1/lib/redis/pipeline.rb:74:in `each'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/redis-4.1.1/lib/redis/pipeline.rb:74:in `map'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/redis-4.1.1/lib/redis/pipeline.rb:74:in `finish'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/redis-4.1.1/lib/redis/pipeline.rb:98:in `finish'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/redis-4.1.1/lib/redis/client.rb:164:in `block in call_pipeline'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/redis-4.1.1/lib/redis/client.rb:306:in `with_reconnect'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/redis-4.1.1/lib/redis/client.rb:162:in `call_pipeline'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/redis-4.1.1/lib/redis.rb:2462:in `block in multi'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/redis-4.1.1/lib/redis.rb:52:in `block in synchronize'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/redis-4.1.1/lib/redis.rb:52:in `synchronize'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/redis-4.1.1/lib/redis.rb:2454:in `multi'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/redis_rate_limiter-0.1.0/lib/redis_rate_limiter.rb:35:in `add'
     # ./lib/sidekiq-rate-limiter/fetch.rb:39:in `block in limit'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/sidekiq-5.2.7/lib/sidekiq.rb:97:in `block in redis'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/connection_pool-2.2.2/lib/connection_pool.rb:65:in `block (2 levels) in with'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/connection_pool-2.2.2/lib/connection_pool.rb:64:in `handle_interrupt'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/connection_pool-2.2.2/lib/connection_pool.rb:64:in `block in with'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/connection_pool-2.2.2/lib/connection_pool.rb:61:in `handle_interrupt'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/connection_pool-2.2.2/lib/connection_pool.rb:61:in `with'
     # /home/jacinda/.rvm/gems/ruby-2.5.3@sidekiq-rate-limiter/gems/sidekiq-5.2.7/lib/sidekiq.rb:94:in `redis'
     # ./lib/sidekiq-rate-limiter/fetch.rb:33:in `limit'
     # ./lib/sidekiq-rate-limiter/fetch.rb:11:in `retrieve_work'
     # ./spec/sidekiq-rate-limiter/fetch_spec.rb:93:in `block (2 levels) in <top (required)>'

Finished in 0.27527 seconds (files took 0.16476 seconds to load)
7 examples, 2 failures

Failed examples:

rspec ./spec/sidekiq-rate-limiter/fetch_spec.rb:54 # Sidekiq::RateLimiter::Fetch should retrieve work
rspec ./spec/sidekiq-rate-limiter/fetch_spec.rb:84 # Sidekiq::RateLimiter::Fetch should accept procs for limit, name, and period config keys

If I update the gemspec to pin redis to 4.1.0, the specs pass. But they fail with 4.1.1 or 4.1.2.

This matches some errors I've seen in production, so it's not just a spec issue.

Indicate when limit reached?

Hi, is there any way to register a handler for when the rate limit is reached? E.g., so clients could log a warning/error or clear the queue.

Sidekiq 5.0.0 compatibility

Sidekiq 5 was released in April, but the gemspec of this project disallows that version. Is it really incompatible or can we expect a new release for this version?

Rate limits with ActiveJob & ActionMailer

Both ActiveJob and ActionMailer enqueue Sidekiq jobs, but those jobs are wrapped in a generic adapter class, which causes all of them to share the same rate-limit rules. To help with this, the Sidekiq adapter includes a "wrapped" attribute in the payload pointing to the original class (sidekiq/sidekiq#2248). It seems like a useful improvement to use the wrapped class when it's present.

What do you think?
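
For illustration, resolving the class to rate-limit could look roughly like the sketch below (not the gem's actual code; job_hash stands for the parsed Sidekiq job payload):

def rate_limited_class(job_hash)
  # ActiveJob/ActionMailer jobs arrive wrapped in a generic adapter class;
  # the 'wrapped' key added by the Sidekiq adapter names the original job class.
  job_hash['wrapped'] || job_hash['class']
end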

Looks like this is no longer working

Hey guys,

It looks like this is no longer working on Sidekiq 3.2.1 & Rails 4.2.3

# app/workers/test_worker.rb
class TestWorker
  include Sidekiq::Worker
  sidekiq_options queue: :test_queue, throttle: {threshold: 20, period: 1.minute}

  def perform(options)
  end
end
# config/sidekiq.yml

---
:concurrency: 10
:pidfile: tmp/pids/sidekiq.pid
development:
  :concurrency: 2
:queues:
  - test_queue

Gem versions

  • sidekiq (3.2.1)
  • sidekiq-rate-limiter (0.1.0)
  • rails (4.2.3)

[Screenshot: spike showing far more throughput than there should be]

It should only be doing 40 jobs/minute (2 workers x 20 jobs/minute) but it's just going full blast.

I haven't dug into this at all yet, but wanted to bring to your attention. I will report back if I find anything.

Rate limiting per worker?

Hi Guys,

We currently use sidekiq-rate-limiter.
We are experiencing that when we use multiple different workers (same queue and name), they each have their own limit.

So for example:
I start FirstWorker 50 times (limit 1 per second).
And start SecondWorker 50 times (limit 1 per second).

It will actually do 2 jobs per second.
Does anybody have a solution for this?

Maintenance of this gem

This gem is very useful; however, it does not seem to be maintained. There are many PRs to address compatibility with newer versions of Sidekiq, but no release supporting Sidekiq 6 has been made for over a year, and now there are outstanding PRs for Sidekiq 6.1 support.

Would the maintainers (@enova, @bwthomas) be willing to let someone else maintain this gem? And would someone who has been maintaining a more up-to-date fork (@controlshift, @packrat386, @scottbartell) be interested in releasing/maintaining this gem?

Ability to limit the number of concurrent jobs

Hi, our use case is that we may have hundreds of Sidekiq workers. Users can add their site URL, and we analyze the pages and provide recommendations. The problem is that we can unintentionally DDoS websites due to the high number of workers that can run in parallel. We need to limit the number of concurrent jobs, e.g. those with the same site_id argument.

Support for Sidekiq-Pro pausing

Sidekiq Pro allows for pausing of queues, but it seems like this functionality is dependent on using reliable_fetch, which is overridden by this gem. Might be worth either adding support for pausing or documenting the incompatibility with the pause feature.

High CPU consumption

When the queue is backed up with nothing but throttled jobs, CPU spikes to 99%. My guess is that the fetcher keeps popping throttled jobs off the queue, only to push them back on since the limit has been reached.

We can't simply sleep until the next rate-limit cycle begins, since other, non-throttled jobs can be queued and processed in the meantime.

Any thoughts on how to prevent the CPU spike described above?

Huge CPU consumption

When using this gem, the Sidekiq processes use almost 100% CPU the whole time.
Configuration: Debian Jessie, Ruby 2.3, Rails 5.beta3, Sidekiq 4.1.1
I run 2 workers with 20 concurrency each.
I have 2 queues: default and limited.
Only limited has the rate option applied, and it is not processing jobs the whole time. That tells me this gem affects processing of the normal queues too.

Here's a screenshot, but it doesn't tell much :)
[Screenshot: sidekiq-rate-limiter]

Rate limiter based on termination time

Hi, does this gem permit an execution rate based on termination time? I have the following need: I want to connect to an internet service that allows only 4 requests every minute, but I cannot send a new request while there is an earlier one that is taking a very long time.

Rate limiter "slows down" over time under sustained workload

Hey guys,

I noticed that the rate limiter can "slow down" after running for a long time:

[Screenshot: Sidekiq performance]

You can see it was really spread out until I did this in a Ruby console:
irb(main):070:0> Sidekiq.redis {|c| c.del("sidekiq-rate-limit:FcWorker") }

The moment I did that, you see how the spikes became much more localized.

It's like the rate limiter "leaks" somehow. This is a Sidekiq worker running on Heroku. I'll continue to monitor the situation, but it's only been running for ~6 hours, so I'm expecting it to pop up again several times over the next couple of days.

In this case, I have 500k queued with a 600/minute rate limit. Is this a known bug when you have a constant stream of jobs?

Oh, and as I'm writing this, you can already see how it's spreading out over time:
[Screenshot: Sidekiq performance #2]

sidekiq 3.0.0 compatibility requested

17:57:14 web.1  | /Users/bmf/Projects/MyApp/vendor/bundle/ruby/2.1.0/gems/sidekiq-3.0.0/lib/sidekiq/fetch.rb:10:in `<class:Fetcher>': uninitialized constant Sidekiq::Fetcher::Util (NameError)
17:57:14 web.1  |   from /Users/bmf/Projects/MyApp/vendor/bundle/ruby/2.1.0/gems/sidekiq-3.0.0/lib/sidekiq/fetch.rb:9:in `<module:Sidekiq>'
17:57:14 web.1  |   from /Users/bmf/Projects/MyApp/vendor/bundle/ruby/2.1.0/gems/sidekiq-3.0.0/lib/sidekiq/fetch.rb:4:in `<top (required)>'
17:57:14 web.1  |   from /Users/bmf/Projects/MyApp/vendor/bundle/ruby/2.1.0/gems/sidekiq-rate-limiter-0.0.1/lib/sidekiq-rate-limiter/fetch.rb:2:in `<top (required)>'
17:57:14 web.1  |   from /Users/bmf/Projects/MyApp/vendor/bundle/ruby/2.1.0/gems/sidekiq-rate-limiter-0.0.1/lib/sidekiq-rate-limiter.rb:2:in `<top (required)>'

Limit Retry jobs as well.

The Sidekiq rate limiter works great for enqueued jobs; however, it would be great if we could also limit jobs from the retry queue, counted against all jobs.

Is there any way to limit jobs from both the enqueued queue and the retry queue together?

Workers on named queue won't execute

I configured my worker like so:

sidekiq_options rate: { limit: 1, period: 2.seconds }

...And it executes fine in the default queue. However, if a queue name is added to this configuration:

sidekiq_options queue: 'myqueue', rate: { limit: 1, period: 2.seconds }

... the messages just sit there in the named queue, and Sidekiq doesn't execute them.

I also tried adding queues in sidekiq.yml, like so:

:queues:
  - default
  - myqueue

Any ideas? Thanks!

Compatibility with sidekiq-limit_fetch

I have a project currently using sidekiq-limit_fetch to manage concurrency limits on a per-queue basis. I have a new requirement that adds time as a dimension with which to throttle jobs, and I would like to use sidekiq-rate-limiter to manage this, but I'm wondering: is there a way to configure sidekiq-rate-limiter to limit the concurrency of jobs, irrespective of a time-based limit? E.g., fetch jobs as quickly as you can, but only 3 workers may work on this kind of job at a time.

For clarity, I have SomeJob that may run as quickly as possible, but I only ever want three workers at a time running this job, and I have AnotherJob, for which concurrency doesn't matter, but I can only process this job at a rate of 1 per second.
