An introduction to the backpropagation algorithm in Julia.
In Lecture-Intro-Backpropagation-p1.ipynb
we start by defining a Value type and building a computational graph. We then walk backward over this graph and compute the gradient at each step.
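A minimal sketch of such a Value type and a manual backward pass (field and function names are assumptions for illustration, not the notebook's exact code):

```julia
# Each Value node records its data, its gradient, its parent nodes,
# and a closure that propagates the output gradient to the parents.
mutable struct Value
    data::Float64
    grad::Float64
    parents::Vector{Value}
    backward::Function
end

Value(x::Real) = Value(float(x), 0.0, Value[], () -> nothing)

function Base.:+(a::Value, b::Value)
    out = Value(a.data + b.data, 0.0, [a, b], () -> nothing)
    # d(a+b)/da = 1 and d(a+b)/db = 1, so the gradient passes through
    out.backward = () -> (a.grad += out.grad; b.grad += out.grad)
    out
end

function Base.:*(a::Value, b::Value)
    out = Value(a.data * b.data, 0.0, [a, b], () -> nothing)
    # d(a*b)/da = b and d(a*b)/db = a
    out.backward = () -> (a.grad += b.data * out.grad;
                          b.grad += a.data * out.grad)
    out
end

# Build the graph d = a*b + c, then go backward by hand, output first.
a, b, c = Value(2.0), Value(3.0), Value(4.0)
e = a * b
d = e + c
d.grad = 1.0      # seed the output gradient
d.backward()      # fills e.grad and c.grad
e.backward()      # fills a.grad and b.grad
```

Calling the `backward` closures in reverse graph order is exactly the chain rule: here `a.grad` ends up as `b.data * 1.0 == 3.0`.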
In Lecture-Intro-Backpropagation-p2.ipynb
we look at how to run the backward pass automatically, define a simple neuron, and train it toward a target value.
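Automating the backward pass means topologically sorting the graph and calling each node's local backward rule in reverse order. A self-contained sketch (the `Value` definition repeats the hypothetical one above; `backprop!` and the learning rate are likewise assumptions, not the notebook's exact code):

```julia
mutable struct Value
    data::Float64
    grad::Float64
    parents::Vector{Value}
    backward::Function
end
Value(x::Real) = Value(float(x), 0.0, Value[], () -> nothing)

function Base.:+(a::Value, b::Value)
    out = Value(a.data + b.data, 0.0, [a, b], () -> nothing)
    out.backward = () -> (a.grad += out.grad; b.grad += out.grad)
    out
end
function Base.:*(a::Value, b::Value)
    out = Value(a.data * b.data, 0.0, [a, b], () -> nothing)
    out.backward = () -> (a.grad += b.data * out.grad;
                          b.grad += a.data * out.grad)
    out
end

# Automatic backward pass: topologically sort the graph reachable from
# the root, seed the root gradient, then run each backward rule in
# reverse topological order.
function backprop!(root::Value)
    topo, visited = Value[], Set{Value}()
    function build(v::Value)
        if !(v in visited)
            push!(visited, v)
            foreach(build, v.parents)
            push!(topo, v)
        end
    end
    build(root)
    root.grad = 1.0
    foreach(v -> v.backward(), reverse(topo))
end

# Train a one-input neuron y = w*x + b toward a target by gradient descent.
w, b = Value(0.5), Value(0.0)
x, target = 2.0, 3.0
for _ in 1:50
    w.grad = 0.0; b.grad = 0.0          # reset gradients each step
    y = w * Value(x) + b
    diff = y + Value(-target)
    loss = diff * diff                  # squared error
    backprop!(loss)
    w.data -= 0.1 * w.grad              # gradient-descent update
    b.data -= 0.1 * b.grad
end
```

After training, `w.data * x + b.data` sits at the target; the same loop structure scales to neurons with many inputs and a nonlinearity.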
It is based on Karpathy's nn-zero-to-hero and Microgram.jl, a port of micrograd to Julia.