
the-benchmarker / web-frameworks


Which is the fastest web framework?

License: MIT License

Languages: Ruby 6.51%, JavaScript 2.15%, Crystal 2.15%, Go 2.49%, Rust 1.21%, Elixir 1.35%, Swift 0.91%, Python 4.23%, C# 0.70%, Scala 0.88%, Objective-C 0.37%, Nim 0.82%, CMake 0.51%, C++ 0.95%, Java 3.55%, Lua 0.33%, PHP 67.89%, Dockerfile 2.44%, C 0.49%, Makefile 0.07%
Topics: benchmark, framework, http, measurement, performance, standard, web

web-frameworks's People

Contributors

an-tao, appleboy, chrislearn, cyrusmsk, dalisoft, dependabot-preview[bot], dependabot[bot], dependencies[bot], dominikzogg, ethosa, fundon, grkek, jaguililla, kelvinst, kiliankoe, kpicaza, krishnatorque, panesofglass, paulcsmith, petersonfs, pyup-bot, renovate[bot], shreyasjejurkar, stakach, sy-records, tanner0101, tbrand, waghanza, whiplash, yurunsoft


web-frameworks's Issues

Node dependencies broken?

Hi,

When I'm building a Node project, either polka or express, I get:

Error: Failed to replace env in config: ${NPM_API_KEY}
    at /usr/lib/node_modules/npm/lib/config/core.js:417:13
    at String.replace (<anonymous>)
    at envReplace (/usr/lib/node_modules/npm/lib/config/core.js:413:12)
    at parseField (/usr/lib/node_modules/npm/lib/config/core.js:391:7)
    at /usr/lib/node_modules/npm/lib/config/core.js:334:17
    at Array.forEach (<anonymous>)
    at Conf.add (/usr/lib/node_modules/npm/lib/config/core.js:333:23)
    at ConfigChain.addString (/usr/lib/node_modules/npm/node_modules.bundled/config-chain/index.js:244:8)
    at Conf.<anonymous> (/usr/lib/node_modules/npm/lib/config/core.js:321:10)
    at /usr/lib/node_modules/npm/node_modules.bundled/graceful-fs/graceful-fs.js:78:16
    at FSReqWrap.readFileAfterClose [as oncomplete] (fs.js:511:3)
TypeError: Cannot read property 'get' of undefined
    at errorHandler (/usr/lib/node_modules/npm/lib/utils/error-handler.js:205:18)
    at /usr/lib/node_modules/npm/bin/npm-cli.js:83:20
    at cb (/usr/lib/node_modules/npm/lib/npm.js:224:22)
    at /usr/lib/node_modules/npm/lib/npm.js:262:24
    at /usr/lib/node_modules/npm/lib/config/core.js:81:7
    at Array.forEach (<anonymous>)
    at Conf.<anonymous> (/usr/lib/node_modules/npm/lib/config/core.js:80:13)
    at Conf.f (/usr/lib/node_modules/npm/node_modules.bundled/once/once.js:25:25)
    at Object.onceWrapper (events.js:315:30)
    at emitOne (events.js:116:13)
    at Conf.emit (events.js:211:7)
    at Conf.add (/usr/lib/node_modules/npm/lib/config/core.js:337:10)
    at ConfigChain.addString (/usr/lib/node_modules/npm/node_modules.bundled/config-chain/index.js:244:8)
    at Conf.<anonymous> (/usr/lib/node_modules/npm/lib/config/core.js:321:10)
    at /usr/lib/node_modules/npm/node_modules.bundled/graceful-fs/graceful-fs.js:78:16
    at FSReqWrap.readFileAfterClose [as oncomplete] (fs.js:511:3)
/usr/lib/node_modules/npm/lib/utils/error-handler.js:205
  if (npm.config.get('json')) {
                 ^

TypeError: Cannot read property 'get' of undefined
    at process.errorHandler (/usr/lib/node_modules/npm/lib/utils/error-handler.js:205:18)
    at emitOne (events.js:116:13)
    at process.emit (events.js:211:7)
    at process._fatalException (bootstrap_node.js:374:26)

@tbrand @lukeed How can I avoid that?

Regards,
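
This npm error means an .npmrc file references ${NPM_API_KEY} and npm cannot expand it because the variable is not set in the environment. A minimal sketch of two possible workarounds, assuming the reference lives in ~/.npmrc:

# Option 1: define the variable; an empty value is enough for public installs
export NPM_API_KEY=""
npm install

# Option 2: drop the offending line from the config entirely
sed -i '/NPM_API_KEY/d' ~/.npmrc
npm install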

Would anyone be a contributor?

I've been busy with other tasks these days,
so it's hard for me to keep maintaining this repository.

Please leave a comment if you can:

  1. build all frameworks included in this repository
  2. using neph

Of course I will back you up when I can.
Thanks!

Failing at the make client step

While trying to run the benchmark locally, I keep running into this error at the make client step:

ld: can't open output file for writing: /Users/kevin/tmp/which_is_the_fastest/benchmarker/bin/client, errno=2 for architecture x86_64
clang: error: linker command failed with exit code 1 (use -v to see invocation)
Error: execution of command failed with code: 1: `cc -o "/Users/kevin/tmp/which_is_the_fastest/benchmarker/bin/client" "${@}"  -rdynamic  -lz `command -v pkg-config > /dev/null && pkg-config --libs libssl || printf %s '-lssl -lcrypto'` `command -v pkg-config > /dev/null && pkg-config --libs libcrypto || printf %s '-lcrypto'` -lpcre -lgc -lpthread /usr/local/Cellar/crystal-lang/0.22.0/src/ext/libcrystal.a -levent -liconv -ldl -L/usr/lib -L/usr/local/lib`

My system is macOS 10.12.4 (Sierra).
Could I be missing something on my system?
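
For what it's worth, errno=2 is ENOENT: the path the linker wants to write to does not exist. A minimal sketch of a likely fix, assuming the only missing piece is the output directory:

# create the output directory the linker expects, then retry the failing step
mkdir -p benchmarker/bin
make client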

Decentralizing repositories

Hi,

We currently have 34 frameworks to compare.

However, maintaining 34 different tools in one place can get complicated.

The purpose is to have one repo per framework, in order to:

  • focus, here, on gathering / running / displaying results, instead of maintaining each framework
  • have CI, badges, etc. for each framework (more production-ready)
  • let each implementation be more complex / maintained separately

@tbrand @OvermindDL1 what do you think?

Regards,

Flask server crashes during benchmarking

This command crashes:

./bin/benchmarker flask

However, executing it manually succeeds.

Server:

./bin/server_python_flask

Client:

curl localhost:3000
curl localhost:3000/user/3
curl -XPOST localhost:3000/user
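
One plausible explanation, purely an assumption here, is that Flask's built-in development server cannot handle the benchmarker's concurrent load, while a single manual curl works fine. A sketch of serving the same app through a production WSGI server instead; the module:variable path is hypothetical:

# run the app under gunicorn with a few workers instead of the Flask dev server
gunicorn -w 4 -b 0.0.0.0:3000 server_python_flask:app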

CI

We need CI to check that PRs build and work correctly.
The CI result does not affect the results of this benchmark since #27.
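
A minimal sketch of what such a CI job could run per framework, reusing the smoke-test requests already shown in this repository; the make target name is hypothetical:

make flask                      # hypothetical per-framework build target
./bin/server_python_flask &     # server binary as built by this repository
server_pid=$!
sleep 5                         # give the server time to boot
curl -f localhost:3000
curl -f localhost:3000/user/3
curl -f -X POST localhost:3000/user
kill "$server_pid"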

Add django

Hi,

Django could be considered a Python alternative to Rails.

It would be interesting to compare Django to Rails, just as Flask can be compared to Sinatra.

Regards,

Parallelism

Some of these languages provide parallelism and others don't.

For Crystal, if you have a multicore system, you need to run multiple processes (1 per core) with reuse port enabled to take advantage of them.
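
A minimal sketch of that setup, assuming the Crystal server binds its port with reuse_port enabled; the binary name below just follows this repository's server_<language>_<framework> pattern and is only an example:

# start one server process per CPU core; with SO_REUSEPORT the kernel
# spreads incoming connections across all of them
for i in $(seq "$(nproc)"); do
  ./bin/server_crystal_kemal &
done
wait    # keep the processes in the foreground until they are killed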

Provide specs

Can you provide specs for your benchmarks?

  • version of each language and framework
  • operating system specs
  • machine specs - memory, cpu, etc
  • all flags used to compile the application
  • how each application was launched
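
A sketch of commands that would capture most of this information on a Linux host; only the language binaries actually installed would apply:

uname -a        # operating system and kernel
lscpu           # CPU model, cores, threads
free -h         # memory
crystal --version; ruby -v; go version; python3 --version; node -v    # language versions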

Benchmarking test - statistical results

So my new server is set up, nice and empty and ready to run tests on (for now), so I'm playing with it. :-)

8 cores (16 hyperthreads), 3.7 GHz, 16 GB of RAM (for now), etc.

First, the result sets currently built into this git project are not indicative of the actual throughput a server can sustain: they only measure how fast a simple iteration of commands completes, rather than how long each request takes, the average latency, the longest request, and so on.

So I whipped up a quick script to test the statistical side in far greater detail than the current git HEAD does; these are the results for a set of servers. I used Rust, Crystal, Go, Python, Elixir, and Node for the servers, mostly because I haven't installed (or figured out how to install, depending) the rest of the stuff, plus these run the fastest overall. I did have an issue with server_python_japronto, which refused to run. server_python_flask did not install flask, so I had to do that manually; that should be fixed. japronto does not seem to be on pip, so it remained missing. I'll post the results here as I get them (the script outputs Markdown, so each post will be a run with a bit of description at the top).

Swift?

What's your stance on including Swift frameworks here? The downside I have in mind is that the list of supported platforms is still rather limited. Things should, however, work on macOS and Ubuntu at the least, since those are officially supported.

The most popular (bigger) frameworks are

Add resources consumption

One library can deliver great performance at the cost of ten times the CPU consumption of another library whose performance is only about two times worse. I think this is an important metric.

The metric could be normalized to something like "10 requests per 1% CPU or per 1 MB of RAM" (just an idea :)
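
A minimal sketch of how such a metric could be collected, assuming pidstat from the sysstat package is available and using a hypothetical server binary name:

# sample the server's CPU and memory once per second for 30 seconds while wrk drives load
./bin/server_crystal_kemal &
server_pid=$!
pidstat -u -r -p "$server_pid" 1 30 > resource_usage.log &
wrk -t4 -c64 -d30s http://localhost:3000/
kill "$server_pid"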

Debating about an official build tool ...

Hi,

I can see that there are three ways of building:

  • make
  • neph
  • docker

I personally use make (because it's easy and I'm lazy).

@tbrand I can see that you are building using neph.

Shouldn't we all build using the same thing?

Behind that is the idea of a cross-platform solution: building Objective-C, .NET, and everything else with the same toolchain / isolation level.

For me, Docker is a good idea, but only for Linux-enabled tools (no ASP, no Swift, ...).

I suggest we use Vagrant (because it supports OS X, Windows, ...).

@tbrand @OvermindDL1 what do you think about that?

Regards,

Add N2O (Erlang)

Hey @5HT

Are you interested in adding your project to this analysis?
It would be very interesting =)

Replace the client with a proper siege tool

The current client is not able to saturate servers properly (at least locally here with 6 native cores), which causes the results of the fastest servers to be inaccurate relative to each other. It should perhaps be replaced with a proper HTTP siege engine such as wrk or httpsiege, which also have a lot more configuration and testing options; wrk tends to be more common. Thoughts? Would a PR be accepted for this, and if so, which siege engine should be used?
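
For reference, a typical wrk invocation that reports a latency distribution as well as throughput; the thread, connection, and duration values are placeholders to tune per machine:

wrk -t "$(nproc)" -c 256 -d 30s --latency http://localhost:3000/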

If one is swapped in, I'd propose using a siege engine capable of:

  • HTTP1.0
  • HTTP1.1
  • HTTP2.0
  • Websocket

Perhaps also testing:

  • Response data for consistency
  • Headers so we can test setting headers
  • HTTP Streaming
  • Error rates:
    • Test proper error handling for invalid requests.
    • Test how the webserver fails when it is saturated, such as delaying responses, dropping connections, falling over in death, etc...
  • Etc...

Any other ideas? Should any of these be left out?

Saving results to the git repo might be useful as well, organized by the system the results were run on, the git version, the date, etc., perhaps with scripts to generate result graphs automatically.

In addition, it could use proper servers for testing. I have a 6-native-core Linux server that I can donate for such testing. I might be able to spool up sessions on a 24-core server, but it is not 'empty' by any stretch, so it would skew results compared to the empty 6-core server. I also have a 4-native-core (8-HT) Linux server on the same network interlink that I can siege 'from' to the 6-core server, to test it over a network so that the siege process itself does not interfere with the actual servers.
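
A sketch of one possible layout for stored results, keyed by host, commit, and date; the paths and the Markdown redirection are assumptions, not an existing convention in this repository:

out="results/$(hostname)/$(git rev-parse --short HEAD)"
mkdir -p "$out"
./bin/benchmarker > "$out/$(date +%F).md"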

Different benchmark results

Hello, can I ask why a few days ago Iris was the fastest, but in your last update you changed its results? As far as I know, Iris is still the same, so what changed?

At update: 2017-09-04

70ac42b

| go | echo | 4.628243 | 4.362947 | 4.466652 |
| go | gorilla_mux | 4.524554 | 4.057697 | 4.336543 |
| go | iris | 4.573527 | 4.239478 | 4.474351 |

Now (last update: 2017-09-05)

| go | echo | 4.243314 | 4.035731 | 4.175869 |
| go | gorilla_mux | 4.001856 | 3.807524 | 3.892316 |
| go | iris | 4.475054 | 4.182971 | 4.271502 |

One day earlier Iris was faster than echo; one day later the results have been touched.

A note: regex is slower than trie algorithms for routing, so I'm also curious why your benchmark results show gorilla/mux faster than Iris and echo; any Go developer knows that gorilla/mux is almost 40 times slower than Iris, echo, and gin...

Relevant to test in PHP?

Hello,

I've seen that there are a lot of languages in this benchmark, but I can't see any PHP frameworks.

Is there any reason not to include them?

Regards,

PS: If it is relevant, I can propose a PR 😜

Framework request: hyper 0.11 (with Tokio)

I'm filing an issue rather than sending a PR here because hyper 0.11 isn't quite out, but is close.

Currently, all the Rust frameworks are based on synchronous I/O. Hyper 0.11 will include Tokio support, which should put up even better numbers.

I'm happy to actually send a PR, either if "using hyper master" is okay, or once it's released.

Display version after language

Hi,

When I read the README.md, I can see that many languages are tested.

There is an indication of the OS version used for the runs.

However, there is no information about the language versions.

For example, benchmark results could differ between Ruby 2.0 and Ruby 2.4.

I think it would be useful to add the language version just after the language name in the README.md results.

Regards,

Invalid benchmarks

Hello, I've seen you try to run benchmark tests on different types of routers and web frameworks (which are not the same thing; it's like comparing apples with cars, but that is your own choice), but this is not a fair benchmark, because some web frameworks run from a dynamic IPv6/IPv4 address and others from a static IPv4 address.

Take, for example, go/gorillamux vs Iris vs echo:
gorilla/mux uses regexp, while Iris uses one of the fastest routing algorithms (it's well known for that) and is already battle-tested in different types of web applications. However, your benchmark results show gorilla/mux outperforming Iris and echo, which should not be possible because Go's regexp is the slowest method; see http://benchmarksgame.alioth.debian.org/u64q/regexredux.html

EDIT:

Also, in Iris you store the path parameter in a variable before sending it to the client, while in echo you send it without storing it in a variable; please fix that too.

Thank you!

Add some routes that return more data

Some web frameworks seem to perform better with larger or smaller payload sizes.

Please add some routes that return more data, possibly just plain text, e.g.:

  • existing routes [almost no data]
  • small routes [e.g.: N bytes]
  • medium routes [e.g.: 10 * N bytes]
  • large routes [e.g.: 100 * N bytes]

Missing Iris from go list but shown on benchmarks - is that possible?

Hey @tbrand, sorry for the noise, but I've just tried to reproduce your results and I couldn't, because you show results for some libraries that do not even exist in this repository. How do you do that?

Let me explain,

I've seen Iris in your benchmark list, but it was never run, because you didn't include that Go framework in the go list. So why do you include Iris in your benchmark results?

The following link shows your current master branch (November 2017, branch=master) and proves what I'm telling you people: https://github.com/tbrand/which_is_the_fastest/blob/3ef78e76b405b9af1b23f61a9375cc23ee4fad79/Makefile#L73. It proves that Iris was never included in the benchmarks, so the results we see on the homepage of this project cannot be trusted.

Please fix that immediately and update your results. Keep in mind that we all know Go's regexp is slower than trie algorithms, so fix Iris and echo (both Iris and echo are faster than gorilla/mux; Iris should be shown as the fastest one among the frameworks built on top of the standard net/http package).

My previous issue about invalid benchmarks: #71

Add beego (Go)

Hi,

Beego could be considered a Go alternative to Rails.

It would be interesting to compare Beego to Rails, just as Kemal can be compared to Sinatra.

Regards,

Benchmarker is reporting up to date when it is not even built

I keep getting this, always, with every PR and every version, even when there is no benchmarker:

╭─overminddl1@snip ~/tmp/which_is_the_fastest  ‹asptest*›
╰─➤  make benchmarker
make: 'benchmarker' is up to date.

However if I run the commands directly then it builds fine:

╭─overminddl1@snip ~/tmp/which_is_the_fastest  ‹asptest*›
╰─➤  cd benchmarker; crystal build src/benchmarker.cr -o bin/benchmarker --release; cd ..
╭─overminddl1@snip ~/tmp/which_is_the_fastest  ‹asptest*›
╰─➤  ln -s -f ../benchmarker/bin/benchmarker bin/.

Everything else builds just fine with make, though...
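
A likely cause, as an assumption about the Makefile rather than something verified here: a directory named benchmarker exists at the repository root, so make treats the target as an existing, up-to-date file. Declaring the target phony forces the recipe to run every time; a minimal sketch mirroring the commands above (recipe lines start with a tab):

.PHONY: benchmarker
benchmarker:
	cd benchmarker && crystal build src/benchmarker.cr -o bin/benchmarker --release
	ln -s -f ../benchmarker/bin/benchmarker bin/.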

Running on a public cloud?

Hi,

I find this project very useful, at least as one input to my technical choices.

I think that running all tests on a single public cloud, such as:

  • AWS
  • DigitalOcean
  • ...

could help guide technical choices.

What do you think about that?

Regards,
