
🧠 Simple Neural Network in Rust 🤖

Welcome to the neural_rs project, implemented in Rust! 🦀

This program demonstrates the fundamentals of a neural network, with forward propagation and backpropagation implemented from scratch.

The code is inspired by this video, in which the author builds the same network in C.

🚀 Getting Started

Prerequisites

Ensure you have Rust installed. You can install Rust using rustup:

curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh

Installation

  1. Clone the repository:

    git clone https://github.com/Mario-SO/neural_rs.git
    cd neural_rs
  2. Run the project:

    cargo run

🎯 Purpose

This simple neural network aims to classify XOR gate outputs 🌓. The network, sketched in code below, consists of:

  • An input layer with 2 neurons
  • One hidden layer with 2 neurons
  • An output layer with 1 neuron
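
As a rough sketch of that 2-2-1 topology (the constant names here are illustrative, not necessarily the identifiers used in the repository), the network and its XOR training data can be described like this:

    // 2-2-1 topology for the XOR problem; names are illustrative.
    const NUM_INPUTS: usize = 2;
    const NUM_HIDDEN: usize = 2;
    const NUM_OUTPUTS: usize = 1;

    // The four XOR examples and their expected outputs.
    const TRAINING_INPUTS: [[f64; NUM_INPUTS]; 4] =
        [[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]];
    const TRAINING_OUTPUTS: [[f64; NUM_OUTPUTS]; 4] =
        [[0.0], [1.0], [1.0], [0.0]];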

๐Ÿ“ How It Works

Initialization

  • Weights and Biases: Initialized randomly between 0 and 1 (see the sketch below).
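
A minimal sketch of that initialization, assuming the `rand` crate (the helper name `random_matrix` is hypothetical):

    use rand::Rng;

    // Fill a weight (or bias) matrix with uniform random values in [0, 1).
    fn random_matrix(rows: usize, cols: usize) -> Vec<Vec<f64>> {
        let mut rng = rand::thread_rng();
        (0..rows)
            .map(|_| (0..cols).map(|_| rng.gen::<f64>()).collect())
            .collect()
    }

The hidden layer would then get a 2×2 weight matrix plus one bias per neuron, and the output layer a 1×2 weight matrix plus a single bias.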

Forward Pass

  1. Hidden Layer: Calculates activations using input data, weights, and biases, then applies the sigmoid function.
  2. Output Layer: Computes the final output using hidden layer activations, weights, and biases, then applies the sigmoid function (see the sketch below).
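
A simplified sketch of a single layer's forward step, not the repository's exact code:

    // Logistic sigmoid activation.
    fn sigmoid(x: f64) -> f64 {
        1.0 / (1.0 + (-x).exp())
    }

    // One layer forward: for each neuron, add its bias to the weighted sum
    // of the previous layer's activations, then apply the sigmoid.
    fn layer_forward(inputs: &[f64], weights: &[Vec<f64>], biases: &[f64]) -> Vec<f64> {
        weights
            .iter()
            .zip(biases)
            .map(|(neuron_weights, bias)| {
                let sum: f64 = neuron_weights.iter().zip(inputs).map(|(w, x)| w * x).sum();
                sigmoid(bias + sum)
            })
            .collect()
    }

Calling layer_forward once for the hidden layer and once more on its result yields the network's prediction for a single input pair.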

Backpropagation

  • Adjusts weights and biases to reduce the error between the predicted and expected outputs, scaling each update by the learning rate and the derivative of the sigmoid function (sketched below).
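
In sketch form, the output-layer half of that update might look like this (function and parameter names are illustrative; the derivative is written in terms of the sigmoid's output so the forward-pass activations can be reused):

    // Derivative of the sigmoid, expressed in terms of its output a = sigmoid(x).
    fn sigmoid_derivative(a: f64) -> f64 {
        a * (1.0 - a)
    }

    // One gradient-descent step for the output layer of the 2-2-1 network.
    // `hidden` holds the hidden activations, `output` the predicted value,
    // `target` the expected XOR value. Illustrative sketch only.
    fn update_output_layer(
        weights: &mut [f64],
        bias: &mut f64,
        hidden: &[f64],
        output: f64,
        target: f64,
        learning_rate: f64,
    ) {
        let delta = (target - output) * sigmoid_derivative(output);
        for (w, h) in weights.iter_mut().zip(hidden) {
            *w += learning_rate * delta * h;
        }
        *bias += learning_rate * delta;
    }

The hidden layer is updated the same way, except its error term is the output delta propagated back through the output weights.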

📊 Training & Accuracy

  • Trains the network over several epochs and prints the accuracy on the training set after training (one way to compute this is sketched after the list).
  • Displays final weights and biases after training is complete.
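
Since the network outputs a continuous value, one plausible way to turn that into an accuracy figure (an assumption about how the repository measures it) is to round each prediction and compare it with the target:

    // Fraction of examples whose rounded prediction matches the target, in percent.
    // Assumes `predictions` and `targets` hold one value per training example.
    fn accuracy(predictions: &[f64], targets: &[f64]) -> f64 {
        let correct = predictions
            .iter()
            .zip(targets)
            .filter(|(p, t)| p.round() == t.round())
            .count();
        correct as f64 / predictions.len() as f64 * 100.0
    }

For the four XOR examples, 100% accuracy simply means all four rounded predictions match their expected outputs.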

📈 Example Output

Each training iteration shows:

  • Input Data
  • Predicted Output
  • Expected Output

After training, it prints the final accuracy, weights, and biases.

Example:

Input: [0.0, 0.0] -> Predicted output: [0.045] (Expected: [0.0])
Input: [0.0, 1.0] -> Predicted output: [0.912] (Expected: [1.0])
...
Final accuracy: 100.00%
Final hidden weights: [[0.2, 0.3], [0.4, 0.1]]
Final output weights: [[0.5, 0.7]]
...

๐Ÿ› ๏ธ Key Functions

  1. initialize_weights: Initializes weights randomly.
  2. sigmoid: Sigmoid activation function.
  3. sigmoid_derivative: Derivative of the sigmoid function for backpropagation.
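
For reference, the activation used throughout is σ(x) = 1 / (1 + e⁻ˣ); because its derivative satisfies σ′(x) = σ(x) · (1 − σ(x)), backpropagation can compute gradients directly from the activations already produced during the forward pass.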

๐Ÿค Contributions

Feel free to fork this repository and contribute by submitting a pull request. Please ensure all changes are well tested.


Happy coding! 🎉

🌟 Star this repository if you found it helpful!
