Multi-Layer and Recurrent Neural Networks from scratch using NumPy
- Explore and prepare CIFAR-10
- Methods for layer initialization and the forward pass (see the first sketch after this list)
- Numerical gradient computation with the finite-difference method (gradient-check sketch below)
- Mini-batch gradient descent (training-loop sketch below)
- Visualize learned weights and feature maps
- Tune and analyse the learning rate and regularization (weight decay)
- Implement and analyse the effects of a cyclic learning rate (schedule sketch below)
- Coarse-to-fine random search over hyperparameters (sampling sketch below)
- Final model reaching roughly 50% accuracy
- Multi-layer network implementation
- Analyse the effects of varying depth from 3 to 9 layers
- Batch normalization implementation (forward-pass sketch below)
- Larger coarse-to-fine hyperparameter search
- Prepare text from *Harry Potter and the Goblet of Fire* (encoding sketch below)
- Implement the RNN sequence-generation layer and methods (sampling sketch below)
- Implement the forward and backward passes
- Train with AdaGrad (update-rule sketch below)
- Analyse the evolution of generated text over training
- Generate paragraphs from the best model
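
The sketches below are minimal NumPy illustrations of the main steps, written under assumed conventions rather than taken from the repo's exact code. First, layer initialization and the forward pass, assuming the column-per-example convention (X is d x n) and He initialization; the names and layer sizes are illustrative:

```python
import numpy as np

def init_layers(sizes, rng):
    # He-style initialization for layer sizes such as [3072, 50, 10]
    Ws = [rng.normal(0.0, np.sqrt(2.0 / m), size=(n, m))
          for m, n in zip(sizes[:-1], sizes[1:])]
    bs = [np.zeros((n, 1)) for n in sizes[1:]]
    return Ws, bs

def softmax(s):
    e = np.exp(s - s.max(axis=0, keepdims=True))
    return e / e.sum(axis=0, keepdims=True)

def forward(X, Ws, bs):
    # X is d x n: one CIFAR-10 image per column; ReLU hidden layers, softmax output
    h = X
    for W, b in zip(Ws[:-1], bs[:-1]):
        h = np.maximum(0.0, W @ h + b)
    return softmax(Ws[-1] @ h + bs[-1])
```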
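A central-difference gradient estimate of the kind used to check analytic gradients; `loss_fn` is a hypothetical zero-argument closure that returns the scalar loss for the current value of `W`:

```python
import numpy as np

def numerical_gradient(loss_fn, W, h=1e-5):
    # Central differences: perturb each weight in turn, measure the change in loss
    grad = np.zeros_like(W)
    for idx in np.ndindex(W.shape):
        old = W[idx]
        W[idx] = old + h
        loss_plus = loss_fn()
        W[idx] = old - h
        loss_minus = loss_fn()
        W[idx] = old
        grad[idx] = (loss_plus - loss_minus) / (2.0 * h)
    return grad

# Compare against the analytic gradient ga with a relative error:
# rel_err = |ga - gn| / max(eps, |ga| + |gn|)
```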
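A mini-batch gradient descent loop, sketched under the assumption of a `compute_grads` callback (hypothetical) that returns analytic gradients for one batch:

```python
import numpy as np

def train(X, Y, Ws, bs, compute_grads, eta=1e-2, n_batch=100, n_epochs=40, seed=0):
    rng = np.random.default_rng(seed)
    n = X.shape[1]
    for _ in range(n_epochs):
        order = rng.permutation(n)          # reshuffle the training set each epoch
        for j in range(0, n, n_batch):
            idx = order[j:j + n_batch]
            grads_W, grads_b = compute_grads(X[:, idx], Y[:, idx], Ws, bs)
            Ws = [W - eta * g for W, g in zip(Ws, grads_W)]
            bs = [b - eta * g for b, g in zip(bs, grads_b)]
    return Ws, bs
```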
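A triangular cyclic-learning-rate schedule in the style of Smith (2015): the rate ramps linearly from `eta_min` to `eta_max` over `n_s` steps and back down. The bound and half-cycle values shown are illustrative, not the tuned ones:

```python
def cyclic_lr(t, eta_min=1e-5, eta_max=1e-1, n_s=500):
    # t is the update-step counter; one full cycle lasts 2 * n_s steps
    x = t % (2 * n_s)
    if x < n_s:
        return eta_min + (x / n_s) * (eta_max - eta_min)
    return eta_max - ((x - n_s) / n_s) * (eta_max - eta_min)
```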
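A coarse-to-fine random search sketch: sample the regularization strength log-uniformly over a wide range, then narrow the range around the best validation scores and repeat. The ranges and sample count here are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
l_min, l_max = -5.0, -1.0                        # coarse range for log10(lambda)
lambdas = 10.0 ** rng.uniform(l_min, l_max, size=10)
# Train briefly with each lambda, rank the runs by validation accuracy,
# then shrink [l_min, l_max] around the winners and sample again (fine stage).
```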
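A batch-normalization forward pass for pre-activation scores `s` of shape d x n, again assuming the column-per-example convention; at test time the per-batch statistics would be replaced by running averages collected during training:

```python
import numpy as np

def batch_norm_forward(s, gamma, beta, eps=1e-8):
    # Normalize each feature over the batch, then scale and shift
    mu = s.mean(axis=1, keepdims=True)
    var = s.var(axis=1, keepdims=True)
    s_hat = (s - mu) / np.sqrt(var + eps)
    return gamma * s_hat + beta
```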
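Character-level preparation of the book text, building the index mappings the RNN needs; the filename is hypothetical:

```python
with open("goblet_book.txt", encoding="utf-8") as f:  # hypothetical filename
    book = f.read()

chars = sorted(set(book))
char_to_ix = {c: i for i, c in enumerate(chars)}      # character -> index
ix_to_char = {i: c for c, i in char_to_ix.items()}    # index -> character
K = len(chars)                                        # vocabulary size (one-hot dimension)
```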
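Sampling text from a vanilla RNN with the standard update h_t = tanh(W h_{t-1} + U x_t + b) and output o_t = V h_t + c; the parameter names follow that convention and are assumptions about the repo's code:

```python
import numpy as np

def softmax(o):
    e = np.exp(o - o.max())
    return e / e.sum()

def sample(W, U, V, b, c, h, first_ix, n, rng):
    # Roll the RNN forward n steps, drawing each character from softmax(o_t)
    K = U.shape[1]
    x = np.zeros((K, 1)); x[first_ix] = 1.0
    out = []
    for _ in range(n):
        h = np.tanh(W @ h + U @ x + b)
        p = softmax(V @ h + c)
        ix = rng.choice(K, p=p.ravel())
        out.append(ix)
        x = np.zeros((K, 1)); x[ix] = 1.0  # sampled character feeds the next step
    return out                             # map back to text with ix_to_char
```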
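The AdaGrad update used during training: squared gradients accumulate per parameter, shrinking the effective step size over time; `eta` here is an illustrative value:

```python
import numpy as np

def adagrad_step(params, grads, memory, eta=0.1, eps=1e-8):
    # memory[k] accumulates the squared gradients of params[k] over all steps
    for k in grads:
        memory[k] += grads[k] ** 2
        params[k] -= eta * grads[k] / np.sqrt(memory[k] + eps)
```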
Best viewed by downloading the PDFs.