🤖 A TypeScript version of karpathy/micrograd - a tiny scalar-valued autograd engine and a neural net on top of it

Home Page: https://trekhleb.dev/micrograd-ts

License: MIT License


Micrograd TS

This is a TypeScript version of karpathy/micrograd repo.
A tiny scalar-valued autograd engine and a neural net on top of it (~200 lines of TS code).

This repo might be useful for those who want to get a basic understanding of how neural networks work, using a TypeScript environment for experimentation.

Project structure

  • micrograd/ - this folder is the core of the repo
    • engine.ts - the scalar Value class that supports basic math operations like add, sub, div, mul, pow, exp, tanh and has a backward() method that calculates the derivatives of the expression with respect to each value, which is required for the back-propagation flow.
    • nn.ts - the Neuron, Layer, and MLP (multi-layer perceptron) classes that implement a neural network on top of the differentiable scalar Values.
  • demo/ - a demo React application to experiment with the micrograd code
    • src/demos/ - several playgrounds where you can experiment with the Neuron, Layer, and MLP classes.

Micrograd

See the 🎬 The spelled-out intro to neural networks and back-propagation: building micrograd YouTube video for a detailed explanation of how neural networks and back-propagation work. The video also explains in detail what the Neuron, Layer, MLP, and Value classes do.

Briefly, the Value class allows you to build a computation graph for some expression that consists of scalar values.

Here is what the computation graph for the a * b + c expression looks like:

graph-1.png
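
To make the picture above concrete, here is a minimal sketch of how such an expression could be composed out of Value objects. The constructor signature and the data/grad property names follow the original micrograd and are assumptions about this port rather than its confirmed API.

// Hypothetical usage sketch of the Value class from micrograd/engine.ts.
// The constructor signature and the data/grad property names are assumptions.
import { Value } from './micrograd/engine'

const a = new Value(2)
const b = new Value(-3)
const c = new Value(10)

const d = a.mul(b).add(c) // builds the computation graph for a * b + c
d.backward()              // back-propagates derivatives through the graph

console.log(d.data) // forward result: 2 * -3 + 10 = 4
console.log(a.grad) // dd/da = b = -3
console.log(b.grad) // dd/db = a = 2
console.log(c.grad) // dd/dc = 1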

Based on the Value class, we can build the Neuron expression X * W + b. Here we're simulating a dot product of the input features X and the neuron weights W:

graph-2.png
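
A hedged sketch of the same idea at the Neuron level follows; the constructor argument (number of inputs) and the forward() method name mirror the original micrograd API and are assumptions about this port.

// Sketch only: Neuron(3) creating 3 weights plus a bias, and forward()
// returning a single Value, are assumptions based on the original micrograd.
import { Value } from './micrograd/engine'
import { Neuron } from './micrograd/nn'

const x = [new Value(1), new Value(-2), new Value(3)] // input features X
const neuron = new Neuron(3)                          // 3 weights W and a bias b

const out = neuron.forward(x) // roughly X · W + b passed through an activation
out.backward()                // gradients now flow back into the weights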

Out of Neurons, we can build the MLP network class that consists of several Layers of Neurons. The full computation graph in this case is too complex to display here, but a simplified version might look like this:

graph-3.png
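
For illustration, here is one possible way to construct such a network. The constructor shape (number of inputs followed by a list of layer sizes) mirrors karpathy/micrograd and is an assumption about this port.

// Sketch: an MLP with 3 inputs, two hidden Layers of 4 Neurons, and 1 output.
// If forward() returns an array of output Values, take its first element.
import { Value } from './micrograd/engine'
import { MLP } from './micrograd/nn'

const mlp = new MLP(3, [4, 4, 1])

const x = [new Value(2), new Value(3), new Value(-1)]
const prediction = mlp.forward(x) // forward pass through all Layers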

The main idea is that the computation graphs above "know" how to do automatic back-propagation (in other words, how to calculate derivatives). This allows us to train the MLP network for several epochs, adjusting the network weights in a way that reduces the final loss:

training-1.gif
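
The training loop behind an animation like this is essentially: forward pass, loss, backward pass, weight update. Below is a hedged sketch of such a loop; parameters(), data, grad, and the method names follow the original micrograd and are assumptions about this TypeScript port, not its confirmed API.

// Hedged training-loop sketch: squared-error loss plus plain gradient descent.
// It assumes MLP.forward() returns a single Value for a 1-output network.
import { Value } from './micrograd/engine'
import { MLP } from './micrograd/nn'

const mlp = new MLP(3, [4, 4, 1])

// Toy dataset: four samples with three features each, and their targets.
const xs = [
  [2, 3, -1],
  [3, -1, 0.5],
  [0.5, 1, 1],
  [1, 1, -1],
].map((row) => row.map((v) => new Value(v)))
const ys = [1, -1, -1, 1].map((v) => new Value(v))

for (let epoch = 0; epoch < 100; epoch += 1) {
  // Forward pass: a prediction for every training sample.
  const predictions = xs.map((x) => mlp.forward(x))

  // Squared-error loss accumulated into a single Value expression.
  let loss = new Value(0)
  predictions.forEach((pred, i) => {
    const diff = pred.sub(ys[i])
    loss = loss.add(diff.mul(diff))
  })

  // Backward pass: reset old gradients, then back-propagate from the loss.
  mlp.parameters().forEach((p) => { p.grad = 0 })
  loss.backward()

  // Gradient-descent step: nudge every weight against its gradient.
  mlp.parameters().forEach((p) => { p.data += -0.05 * p.grad })
}

A real setup would add batching and a learning-rate schedule, but the structure of the loop stays the same.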

Demo (online)

To see the online demo/playground, check the following link:

🔗 trekhleb.dev/micrograd-ts

Demo (local)

If you want to experiment with the code locally, follow the instructions below.

Setup

Clone the current repo locally.

Switch to the demo folder:

cd ./demo

Set up Node.js v18 using nvm (optional):

nvm use

Install dependencies:

npm i

Launch the demo app:

npm run dev

The demo app will be available at http://localhost:5173/micrograd-ts

Playgrounds

Go to ./demo/src/demos/ to explore several playgrounds for the Neuron, Layer, and MLP classes.

Author

The TypeScript version of the karpathy/micrograd repo was created by @trekhleb.


micrograd-ts's Issues

Visualize Test Predictions after Training

Currently DemoMLPTraining visualizes the final predictions of the model based on the training data alongside a graph showing the loss over time. However, there is no visualization of how the model performs on a range of test data after it has been trained.
