jo-osko / effect-handlers-bench

This project is forked from effect-handlers/effect-handlers-bench.

Benchmark repository of polyglot effect handler examples

License: MIT License

Languages: Dockerfile 17.83%, Makefile 14.79%, OCaml 15.27%, Haskell 52.11%

Effect handlers benchmarks suite

The project aims to build a repository of systems that implement effect handlers, benchmarks implemented in those systems, and scripts to build the systems, run the benchmarks, and produce the results.

A system may either be a programming language that has native support for effect handlers, or a library that embeds effect handlers in another programming language.
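For readers unfamiliar with the feature, the following OCaml 5 (Multicore OCaml) sketch shows the idea of an effect handler. The `Ask` effect and the handler are illustrative names invented for this example, not part of this suite:

```ocaml
(* A minimal sketch of an effect handler in OCaml 5.
   The [Ask] effect and its handler are illustrative, not part of this suite. *)
open Effect
open Effect.Deep

type _ Effect.t += Ask : int Effect.t

(* The computation performs [Ask] twice; the handler resumes each
   request with the value 21, so the whole computation returns 42. *)
let run () =
  match_with (fun () -> perform Ask + perform Ask) ()
    { retc = (fun v -> v);
      exnc = raise;
      effc = (fun (type a) (eff : a Effect.t) ->
        match eff with
        | Ask -> Some (fun (k : (a, _) continuation) -> continue k 21)
        | _ -> None) }

let () = Printf.printf "%d\n" (run ())  (* prints 42 *)
```

A system with native support exposes such operations directly in the language; a library-based system encodes them on top of an existing language's control-flow features.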

Quick start

Ensure that Docker is installed on your system. Then,

$ make bench_ocaml

runs the Multicore OCaml benchmarks and writes the results to _results/ocaml.csv.

Benchmark availability

Systems:

  • Eff
  • Handlers in Action
  • Koka
  • libhandler
  • libmpeff
  • Links
  • Multicore OCaml

Benchmarks:

  • N-queens: counts the number of solutions to the N-queens problem for an N x N board.
  • Generator: computes the sum of the elements of a complete binary tree using a generator.
  • Tree explore: nondeterministically explores a complete binary tree with additional state.
  • Triples: nondeterministically computes the triples that sum to a specified number.
  • Simple counter: repeatedly applies an operation in a non-tail position.
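To illustrate the flavour of these benchmarks, here is a hedged OCaml 5 sketch in the spirit of the Generator benchmark. The tree type, helper names, and tree depth are hypothetical and chosen for this example; the repository's actual benchmark sources differ:

```ocaml
(* Illustrative sketch in the spirit of the Generator benchmark:
   sum the elements of a complete binary tree via a [Yield] effect.
   All names and the tree depth are hypothetical. *)
open Effect
open Effect.Deep

type tree = Leaf | Node of tree * int * tree

type _ Effect.t += Yield : int -> unit Effect.t

(* Build a complete binary tree of depth [d] with value 1 at every node. *)
let rec make d = if d = 0 then Leaf else Node (make (d - 1), 1, make (d - 1))

(* In-order traversal that yields each element to the handler. *)
let rec iter = function
  | Leaf -> ()
  | Node (l, v, r) -> iter l; perform (Yield v); iter r

(* Handle [Yield] by accumulating into a counter and resuming. *)
let sum_tree t =
  let total = ref 0 in
  match_with iter t
    { retc = (fun () -> !total);
      exnc = raise;
      effc = (fun (type a) (eff : a Effect.t) ->
        match eff with
        | Yield v -> Some (fun (k : (a, _) continuation) ->
            total := !total + v; continue k ())
        | _ -> None) }

let () = Printf.printf "%d\n" (sum_tree (make 3))  (* 7 nodes, prints 7 *)
```

The handler plays the role of the generator's consumer; the one-shot continuation `k` is resumed once per yielded element.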

Legend:

  • ✅ : Benchmark is implemented
  • ❌ : Benchmark is not implemented
  • ➖ : Benchmark is unsuitable for this system, and there is no sense in implementing it (e.g. benchmarking the speed of file transfer in a language that does not support networking)

Directory structure

  • systems/<system_name>/Dockerfile is the Dockerfile used to build the system.
  • benchmarks/<system_name>/NNN_<benchmark_name>/ contains the source for the benchmark <benchmark_name> for the system <system_name>.
  • benchmark_descriptions/NNN_<benchmark_name>.md contains the description of the benchmark, its inputs and outputs, and any special considerations.
  • Makefile is used to build the systems and benchmarks, and to run the benchmarks. For each system, the Makefile has the following rules:
    • sys_<system_name>: builds the <system_name> Docker image.
    • bench_<system_name>: runs the benchmarks using the Docker image for <system_name>.
  • LABELS.md contains a list of available benchmark labels. Each benchmark can be assigned multiple labels.
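As an illustration, the per-system rule pair might look like the following sketch. The image tag and recipes are hypothetical, not the repository's actual Makefile:

```makefile
# Hypothetical sketch of the per-system rule pair described above;
# the image name and run command are illustrative only.
sys_awesome_system:
	docker build -t effect-handlers-bench/awesome_system systems/awesome_system

bench_awesome_system: sys_awesome_system
	mkdir -p _results
	docker run --rm effect-handlers-bench/awesome_system > _results/awesome_system.csv
```

Making `bench_<system_name>` depend on `sys_<system_name>` ensures the image is built before the benchmarks run.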

Contributing

Benchmarking chairs

The role of the benchmarking chairs is to curate the repository, monitor the quality of benchmarks, and solicit new benchmarks and fixes to existing ones. Each benchmarking chair serves two consecutive six-month terms.

The current co-chairs are

Benchmark

If you wish to add a new benchmark goat_benchmark for system awesome_system,

  • Pick the next serial number NNN for the benchmark.
  • Add the benchmark sources under benchmarks/<awesome_system>/NNN_<goat_benchmark>.
  • Update the Makefile to build and run the benchmark.
  • Add a benchmark description under benchmark_descriptions/NNN_<goat_benchmark>.md, using the template provided in benchmark_descriptions/000_template.md and clearly stating the inputs, outputs, and expectations of the benchmark. Make sure you mention the default input argument for the benchmark.
  • Update this README.md file to add the new benchmark to the table of benchmarks and to the benchmark availability table.

If you wish to add a benchmark leet_benchmark for a system awesome_system when the benchmark is already available for another system:

  • Use the same serial number NNN that the existing system uses for the benchmark.
  • Add the benchmark sources under benchmarks/<awesome_system>/NNN_<leet_benchmark>.
  • Update the Makefile to build and run the benchmark, using the same parameters as given in the benchmark description.

System

If you wish to contribute a system awesome_system, please

  • add a new Dockerfile at systems/<awesome_system>/Dockerfile
  • add a new workflow under .github/workflows/system_<awesome_system>.yml
  • create a status badge for the new workflow and add it to the availability table in lexicographic order.

Ideally, you will also add benchmarks to go with the new system and update the benchmark availability table.

Having a dockerfile aids reproducibility and ensures that we can build the system from scratch natively on a machine if needed. The benchmarking chair will push the image to Docker Hub so that systems are easily available for wider use.

We use Ubuntu 20.04 as the base image for building the systems and hyperfine to run the benchmarks.
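A system Dockerfile following these conventions might start like the sketch below. The package names and build steps are hypothetical; each system's actual Dockerfile will differ:

```dockerfile
# Hypothetical sketch of a system Dockerfile; packages and build
# steps are illustrative, not this repository's actual recipes.
FROM ubuntu:20.04
RUN apt-get update && \
    apt-get install -y --no-install-recommends build-essential ca-certificates
COPY . /build
WORKDIR /build
RUN make            # build the system from source inside the image
```

Pinning the `ubuntu:20.04` base keeps images consistent across systems, which matters when comparing benchmark timings between them.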

Artifacts

We curate software artifacts from papers related to effect handlers. If you wish to contribute your artifacts, then please place your artifact as-is under a suitable directory in artifacts/.

There is no review process for artifacts (other than that they must be related to work on effect handlers). Whilst we do not enforce any standards on artifacts, we do recommend that artifacts conform with the artifacts evaluation packaging guidelines used by various programming language conferences.

Contributors

dhil, jo-osko, kayceesrk, slindley
