
Home Page: https://sush4nt.github.io/neural_networks/

License: Apache License 2.0


neural_networks

A Python-only implementation of neural networks.

It contains two classes:
1. NeuralNetwork: takes input and performs forward + backward propagation based on the defined loss function.
2. Optimizer: contains batch gradient descent and Adam optimization methods to update the network weights.

Credits: This repository is largely based on Ayush's work: repo
Many thanks!

Install

pip install neural_networks

How to use

A. The module provides a NeuralNetwork class for building a neural network.

Inputs:

  1. layer dimensions (e.g. [10, 20, 1] for input, hidden, and output layer sizes)
  2. activation functions for the hidden and output layers; valid activation functions include:
    1. "relu"
    2. "sigmoid"
    3. "tanh"
    4. "softmax"
  3. loss functions:
    1. "MSE"
    2. "CrossEntropyLoss"
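
The four activations listed above can be sketched in NumPy as follows. This is illustrative only, not the package's internals; the softmax is written for the features-first layout (samples in columns) used in the example below.

```python
import numpy as np

# Illustrative NumPy versions of the supported activations;
# the package's own implementation may differ in detail.
def relu(z):
    return np.maximum(0, z)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def tanh(z):
    return np.tanh(z)

def softmax(z):
    # subtract the column-wise max for numerical stability
    e = np.exp(z - np.max(z, axis=0, keepdims=True))
    return e / np.sum(e, axis=0, keepdims=True)
```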

B. The module also contains an Optimizer class, which takes the training data and optimizes the parameter weights according to the chosen optimizer and loss function.
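
Conceptually, each optimizer step moves every weight against the gradient of the loss. A minimal NumPy sketch of one batch-gradient-descent update for a single linear layer with MSE loss (illustrative only; treating `lamb` as an L2 regularization strength is an assumption, not the package's confirmed behavior):

```python
import numpy as np

# One batch-gradient-descent step for a single linear layer, MSE loss.
# Sketch only; 'lamb' as an L2 term is an assumption about the package.
def gd_step(W, b, X, Y, alpha=0.07, lamb=0.0):
    m = X.shape[1]                         # number of samples (columns)
    Y_hat = W @ X + b                      # forward pass
    dZ = Y_hat - Y                         # dLoss/dY_hat for MSE (up to a constant)
    dW = (dZ @ X.T) / m + (lamb / m) * W   # gradient plus assumed L2 term
    db = np.sum(dZ, axis=1, keepdims=True) / m
    return W - alpha * dW, b - alpha * db  # updated parameters
```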

# test run for a binary classification problem:
import numpy as np
from sklearn import datasets

# import path assumes the package name from `pip install neural_networks`
from neural_networks import NeuralNetwork, Optimizer

np.random.seed(3)
print('Running a binary classification test')

# Generate sample binary classification data
data = datasets.make_classification(n_samples=30000, n_features=10, n_classes=2)
X = data[0].T                      # shape (10, 30000): features x samples
Y = data[1].reshape(30000, 1).T    # shape (1, 30000)

print("Input shape: ", X.shape)
print("Output shape: ", Y.shape)

# Build the network: 10 inputs -> 20 hidden units (relu) -> 1 output (sigmoid)
net = NeuralNetwork(layer_dimensions=[10, 20, 1],
                    activations=['relu', 'sigmoid'])
net.cost_function = 'CrossEntropyLoss'
print(net)

# Optimize using standard gradient descent
optim = Optimizer.gradientDescentOptimizer
optim(input=X,
      mappings=Y,
      net=net,
      alpha=0.07,      # learning rate
      epoch=200,
      lamb=0.05,
      print_at=100)    # print the loss every 100 epochs
output = net.forward(X)

# Convert the probabilities to output values
output = 1 * (output >= 0.5)
accuracy = np.sum(output == Y) / 30000
print('for gradient descent \n accuracy = ', np.round(accuracy * 100, 5))
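
The two supported cost functions can be written in NumPy as follows. These are the standard formulas for MSE and binary cross-entropy; the package's implementation may differ in scaling or numerical safeguards.

```python
import numpy as np

# Standard formulas for the two supported losses ("MSE" and
# "CrossEntropyLoss"); the package may scale or clip differently.
def mse(Y_hat, Y):
    return np.mean((Y_hat - Y) ** 2)

def cross_entropy_loss(Y_hat, Y, eps=1e-12):
    Y_hat = np.clip(Y_hat, eps, 1 - eps)  # avoid log(0)
    return -np.mean(Y * np.log(Y_hat) + (1 - Y) * np.log(1 - Y_hat))
```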
