Neural net examples
In this repo you will find examples of neural networks implemented from scratch using my matrix library.
Examples included
perceptron
1. The perceptron algorithm. The activation function is a binary step function, the Heaviside step function; the loss function is hinge loss. The perceptron is capable of binary classification, and this example functions as an OR gate.
{
"layers": [2, 1],
"activation_function": ["heaviside"]
}
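The repo's code is Nim, but the perceptron learning rule is easy to sketch in Python with NumPy standing in for the matrix library. Everything below (names, learning rate, epoch count) is illustrative, not the repo's actual code:

```python
import numpy as np

def heaviside(x):
    # Binary step activation: 1 if x >= 0, else 0
    return np.where(x >= 0, 1, 0)

# OR-gate training data
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 1])

rng = np.random.default_rng(0)
w = rng.uniform(-1, 1, size=2)
b = 0.0
lr = 0.1

# Perceptron learning rule: weights move only on misclassified samples
for _ in range(100):
    for xi, target in zip(X, y):
        pred = heaviside(xi @ w + b)
        w += lr * (target - pred) * xi
        b += lr * (target - pred)

print(heaviside(X @ w + b))  # → [0 1 1 1]
```

OR is linearly separable, so the perceptron convergence theorem guarantees this loop eventually classifies all four inputs correctly.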
neural
2. A two-layer neural network trained with gradient descent. Both layers use the
sigmoid activation function; the example functions as an XOR gate. Weights are
initialized from a uniform distribution
U(-sqrt(6 / (in + out)), sqrt(6 / (in + out))) (Xavier initialization).
{
"layers": [2, 3, 1],
"activation_function": ["sigmoid", "sigmoid"]
}
momentums
3. Same as the previous example, but with the momentum method added. Momentum improves training speed and accuracy by helping the optimizer avoid getting stuck in local minima. XOR gate.
{
"layers": [2, 5, 1],
"activation_function": ["sigmoid", "sigmoid"]
}
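The momentum update itself is a one-liner; here is a hedged sketch (classical momentum, illustrative coefficients) applied to a toy quadratic rather than the repo's network:

```python
import numpy as np

def momentum_step(w, grad, velocity, lr=0.1, beta=0.9):
    # Classical momentum: accumulate an exponentially decaying
    # average of past gradients and step along it
    velocity = beta * velocity - lr * grad
    return w + velocity, velocity

# Toy quadratic f(w) = w^2, whose gradient is 2w
w, v = 5.0, 0.0
for _ in range(100):
    w, v = momentum_step(w, 2 * w, v)
print(abs(w))  # approaches the minimum at 0
```

In the network, the same rule is applied per weight matrix: each matrix keeps its own velocity, and the plain `W -= lr * grad` step is replaced by the velocity update.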
exseimion
4. Handwritten digit classification, a multi-class classification problem. The data set used is semeion.data (the Semeion handwritten digit dataset of 16x16 binary images, hence the 256 inputs).
{
"layers": [256, 336, 10],
"activation_function": ["sigmoid", "softmax"]
}
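The softmax output layer turns the 10 output logits into a probability distribution over the digit classes. A small NumPy sketch (the stabilizing max-subtraction is a standard trick, assumed rather than taken from the repo):

```python
import numpy as np

def softmax(z):
    # Subtract the row max for numerical stability; each row sums to 1
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# A batch of 2 logit vectors over the 10 digit classes
logits = np.array([[2.0, 1.0] + [0.0] * 8,
                   [0.0] * 9 + [3.0]])
probs = softmax(logits)
print(probs.sum(axis=1))     # → [1. 1.]
print(probs.argmax(axis=1))  # predicted digits → [0 9]
```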
minibatches
5. Same as the previous example, but the data is split into small batches (subsets). Minibatching improves memory efficiency and accuracy at the cost of some compute efficiency. todo: avoid copying?
{
"layers": [256, 336, 10],
"activation_function": ["sigmoid", "softmax"]
}
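A typical minibatch loop shuffles the sample indices once per epoch and slices them into fixed-size chunks. A Python/NumPy sketch (the function name and batch size are illustrative):

```python
import numpy as np

def iter_minibatches(X, y, batch_size, rng):
    # Shuffle indices once per epoch, then yield one chunk at a time
    idx = rng.permutation(len(X))
    for start in range(0, len(X), batch_size):
        batch = idx[start:start + batch_size]
        yield X[batch], y[batch]

X = np.arange(20, dtype=float).reshape(10, 2)  # 10 samples, 2 features
y = np.arange(10)

for xb, yb in iter_minibatches(X, y, batch_size=4, rng=np.random.default_rng(0)):
    print(xb.shape)  # batches of 4, 4, and 2 samples
```

On the "avoid copying" note: NumPy's fancy indexing `X[batch]` always copies; shuffling the whole array once and then taking contiguous slices (which are views) would avoid the per-batch copy.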
DISCLAIMER: For learning purposes only. Nim has its own machine learning framework, Arraymancer, as well as Torch bindings.
Acknowledgments
License
This library is distributed under the MIT license.