- Neural network architectures
- Multilayer perceptrons (also known as feed-forward neural networks) and calculating their output
- Training multilayer perceptrons with backpropagation and gradient descent
- Read Mark Humphrys’s notes on multi-layer neural networks and the backpropagation learning algorithm
- Read Andrew Trask’s articles on building a neural network in Python (part 1) and optimizing it with gradient descent (part 2); pay close attention to the diagrams in part 2 to gain an intuition for gradient descent
- Watch Alexander Ihler’s videos on neural networks, backpropagation training, and gradient descent
- Read the Asimov Institute’s beautiful article on architectures, The Neural Network Zoo
- Read Alexei Borissov’s notes on multilayer perceptrons and follow the backpropagation weight update example (slides 16-27)
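The topics above (calculating a multilayer perceptron's output, then training it with backpropagation and gradient descent) can be sketched in a few dozen lines of pure Python, in the spirit of Trask's articles. This is a minimal illustration, not code from any of the readings: the 2-2-1 layout, sigmoid activations, OR training data, and learning rate are all arbitrary choices made for the example.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class MLP:
    """Tiny 2-2-1 multilayer perceptron with sigmoid units (illustrative only)."""
    def __init__(self, seed=0):
        rng = random.Random(seed)
        # hidden layer: 2 neurons, each with 2 input weights + 1 bias
        self.w_h = [[rng.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
        # output neuron: 2 hidden weights + 1 bias
        self.w_o = [rng.uniform(-1, 1) for _ in range(3)]

    def forward(self, x):
        # forward pass: weighted sum of inputs, squashed by the sigmoid
        h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in self.w_h]
        o = sigmoid(self.w_o[0] * h[0] + self.w_o[1] * h[1] + self.w_o[2])
        return h, o

    def train_step(self, x, target, lr=0.5):
        h, o = self.forward(x)
        # output delta: derivative of squared error through the sigmoid
        delta_o = (o - target) * o * (1 - o)
        # hidden deltas: error backpropagated through the output weights
        delta_h = [delta_o * self.w_o[i] * h[i] * (1 - h[i]) for i in range(2)]
        # gradient-descent weight updates
        for i in range(2):
            self.w_o[i] -= lr * delta_o * h[i]
        self.w_o[2] -= lr * delta_o
        for i in range(2):
            for j in range(2):
                self.w_h[i][j] -= lr * delta_h[i] * x[j]
            self.w_h[i][2] -= lr * delta_h[i]
        return 0.5 * (o - target) ** 2

# train on the logical OR function (an arbitrary toy dataset for the demo)
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]
net = MLP()
before = sum(0.5 * (net.forward(x)[1] - t) ** 2 for x, t in data)
for _ in range(5000):
    for x, t in data:
        net.train_step(x, t)
after = sum(0.5 * (net.forward(x)[1] - t) ** 2 for x, t in data)
```

After training, the total squared error drops well below its initial value and rounding the network's output reproduces the OR truth table; the backpropagation weight-update example in the Borissov slides walks through one such `train_step` by hand.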