
This project forked from huggingface/candle


Minimalist ML framework for Rust

License: Apache License 2.0


candle

Candle is a minimalist ML framework for Rust with a focus on ease of use and performance (including GPU support). Try our online demos: whisper, llama2.

use candle::{Device, Tensor};

let a = Tensor::randn(0f32, 1., (2, 3), &Device::Cpu)?;
let b = Tensor::randn(0f32, 1., (3, 4), &Device::Cpu)?;

let c = a.matmul(&b)?;
println!("{c}");
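To make the shapes concrete: a is 2x3 and b is 3x4, so c is 2x4. Here is a minimal plain-Rust sketch of the same product (no candle dependency; fixed values stand in for randn, purely for illustration):

```rust
// Naive matmul illustrating the shapes in the snippet above:
// a is (m, k), b is (k, n), so c = a * b is (m, n).
fn matmul(a: &[Vec<f32>], b: &[Vec<f32>]) -> Vec<Vec<f32>> {
    let (m, k, n) = (a.len(), b.len(), b[0].len());
    let mut c = vec![vec![0f32; n]; m];
    for i in 0..m {
        for j in 0..n {
            for p in 0..k {
                c[i][j] += a[i][p] * b[p][j];
            }
        }
    }
    c
}

fn main() {
    // Fixed values instead of randn, purely for illustration.
    let a = vec![vec![1f32, 2., 3.], vec![4., 5., 6.]]; // 2x3
    let b = vec![vec![1f32, 0., 0., 1.]; 3]; // 3x4
    let c = matmul(&a, &b);
    assert_eq!(c.len(), 2);
    assert_eq!(c[0].len(), 4);
    println!("{c:?}");
}
```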

Check out our examples:

Run them using the following commands:

cargo run --example whisper --release
cargo run --example llama --release
cargo run --example falcon --release
cargo run --example bert --release
cargo run --example bigcode --release

To use CUDA, add --features cuda to the example command line.

There are also some wasm examples for whisper and llama2.c. You can either build them with trunk or try them online: whisper, llama2.

For llama2, run the following command to retrieve the weight files and start a test server:

cd candle-wasm-examples/llama2-c
wget https://karpathy.ai/llama2c/model.bin
wget https://github.com/karpathy/llama2.c/raw/master/tokenizer.bin
trunk serve --release --public-url /candle-llama2/ --port 8081

And then browse to http://localhost:8081/candle-llama2.

Features

  • Simple syntax, looks and feels like PyTorch.
  • CPU and CUDA backends, with support for m1, f16, and bf16.
  • Serverless (CPU) deployments that are small and fast.
  • WASM support: run your models in a browser.
  • Model training.
  • Distributed computing using NCCL.
  • Models out of the box: Llama, Whisper, Falcon, StarCoder...
  • Embed user-defined ops/kernels, such as flash-attention v2.

How to use?

Cheatsheet:

|            | Using PyTorch                      | Using Candle                                                               |
|------------|------------------------------------|----------------------------------------------------------------------------|
| Creation   | torch.Tensor([[1, 2], [3, 4]])     | Tensor::new(&[[1f32, 2.], [3., 4.]], &Device::Cpu)?                        |
| Creation   | torch.zeros((2, 2))                | Tensor::zeros((2, 2), DType::F32, &Device::Cpu)?                           |
| Indexing   | tensor[:, :4]                      | tensor.i((.., ..4))?                                                       |
| Operations | tensor.view((2, 2))                | tensor.reshape((2, 2))?                                                    |
| Operations | a.matmul(b)                        | a.matmul(&b)?                                                              |
| Arithmetic | a + b                              | &a + &b                                                                    |
| Device     | tensor.to(device="cuda")           | tensor.to_device(&Device::Cuda(0))?                                        |
| Dtype      | tensor.to(dtype=torch.float16)     | tensor.to_dtype(&DType::F16)?                                              |
| Saving     | torch.save({"A": A}, "model.bin")  | candle::safetensors::save(&HashMap::from([("A", A)]), "model.safetensors")? |
| Loading    | weights = torch.load("model.bin")  | candle::safetensors::load("model.safetensors", &device)                    |
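On the view/reshape row above: like torch.Tensor.view, a reshape of a contiguous tensor typically reinterprets the same underlying buffer rather than copying it. A plain-Rust sketch of the row-major index arithmetic involved (no candle code, just illustration):

```rust
// Row-major index arithmetic: element (i, j) of a (rows, cols) view
// over a flat buffer lives at offset i * cols + j.
fn index_2d(i: usize, j: usize, cols: usize) -> usize {
    i * cols + j
}

fn main() {
    let data = [1f32, 2., 3., 4.]; // flat buffer of 4 elements
    // Viewed as (2, 2), the rows are [1, 2] and [3, 4].
    assert_eq!(data[index_2d(0, 1, 2)], 2.0);
    assert_eq!(data[index_2d(1, 0, 2)], 3.0);
    // Viewed as (1, 4), the single row is the buffer itself.
    assert_eq!(data[index_2d(0, 3, 4)], 4.0);
    println!("reshape is just reindexing");
}
```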

Structure

FAQ

Why Candle?

Candle stems from the need to reduce binary size in order to make serverless inference possible. The full engine is much smaller than PyTorch's very large library, which makes it much faster to spin up runtimes on a cluster.

It also removes Python from production workloads entirely. Python can add real overhead in more complex workflows, and the GIL is a notorious source of headaches.

Rust is cool, and a lot of the HF ecosystem already has Rust crates, such as safetensors and tokenizers.

Other ML frameworks

  • dfdx is a formidable crate, with shapes included in the types. This prevents a lot of headaches by getting the compiler to complain about shape mismatches right off the bat. However, we found that some features still require nightly, and writing code can be a bit daunting for non-Rust experts.

    We're leveraging and contributing to other core crates for the runtime, so hopefully both crates can benefit from each other.

  • burn is a general crate that can leverage multiple backends, so you can choose the best engine for your workload.

  • tch-rs provides bindings to the torch library in Rust. It is extremely versatile, but it brings the entire torch library into the runtime. The main contributor of tch-rs is also involved in the development of candle.

Missing symbols when compiling with the mkl feature.

If you get missing symbols when compiling binaries/tests that use the mkl feature, e.g.:

  = note: /usr/bin/ld: (....o): in function `blas::sgemm':
          .../blas-0.22.0/src/lib.rs:1944: undefined reference to `sgemm_' collect2: error: ld returned 1 exit status

  = note: some `extern` functions couldn't be found; some native libraries may need to be installed or have their path specified
  = note: use the `-l` flag to specify native libraries to link
  = note: use the `cargo:rustc-link-lib` directive to specify the native libraries to link with Cargo (see https://doc.rust-lang.org/cargo/reference/build-scripts.html#cargorustc-link-libkindname)

This is likely due to a missing linker flag needed to enable the mkl library. You can try adding the following at the top of your binary:

extern crate intel_mkl_src;
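For completeness, a sketch of what the Cargo.toml side of this setup might look like; the versions are placeholders, and the exact feature name for your candle version is an assumption to check against the crate docs:

```toml
[dependencies]
# Versions are placeholders; the "mkl" feature name is an assumption.
candle = { version = "*", features = ["mkl"] }
# Provides the native MKL symbols (e.g. sgemm_) at link time.
intel-mkl-src = "*"
```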

How to know where an error comes from.

You can set RUST_BACKTRACE=1 to be provided with backtraces when a candle error is generated.

