Test various SGD algorithms on logistic regression and MLP, including
- vanilla SGD
- Momentum
- Nesterov Accelerated Gradient
- AdaGrad
- RMSProp
- AdaDelta
- Adam
- Adamax
The relationship among these algorithms is shown in the following figure (my personal view).
This code is based on Theano; please install the relevant packages. The implementations of logistic regression and MLP follow the Theano tutorial.
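As a quick reference, the core update rules of a few of the algorithms above can be sketched in plain NumPy. This is an illustrative stand-alone sketch, not the repository's Theano code; the function names and hyperparameter defaults are my own choices.

```python
import numpy as np

def sgd(w, g, lr=0.1):
    """Vanilla SGD: step against the gradient."""
    return w - lr * g

def momentum(w, g, v, lr=0.1, mu=0.9):
    """Momentum: accumulate a velocity, then step along it."""
    v = mu * v - lr * g
    return w + v, v

def adam(w, g, m, v, t, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
    """Adam: bias-corrected first and second moment estimates (t starts at 1)."""
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g ** 2
    m_hat = m / (1 - b1 ** t)
    v_hat = v / (1 - b2 ** t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

# Tiny demo: minimize f(w) = w^2 (gradient 2w) with vanilla SGD.
w = np.array(5.0)
for step in range(100):
    w = sgd(w, 2 * w, lr=0.1)
# w shrinks toward the minimum at 0
```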
We measure the performance of these SGD algorithms by comparing their training curves and validation errors.
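The comparison methodology can be sketched as follows, again in plain NumPy rather than the repository's Theano code: train the same logistic-regression model with two optimizers and record the training loss each epoch. The data, hyperparameters, and function names here are made up for illustration.

```python
import numpy as np

# Synthetic binary-classification data (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
true_w = rng.normal(size=5)
y = (X @ true_w + 0.1 * rng.normal(size=200) > 0).astype(float)

def loss_and_grad(w, X, y):
    """Logistic-regression negative log-likelihood and its gradient."""
    p = 1.0 / (1.0 + np.exp(-(X @ w)))
    loss = -np.mean(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))
    grad = X.T @ (p - y) / len(y)
    return loss, grad

def train(update, w, state, epochs=50):
    """Run full-batch training, returning the per-epoch loss curve."""
    curve = []
    for t in range(1, epochs + 1):
        loss, g = loss_and_grad(w, X, y)
        curve.append(loss)
        w, state = update(w, g, state, t)
    return curve

def sgd_step(w, g, state, t, lr=0.5):
    return w - lr * g, state

def adam_step(w, g, state, t, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    m, v = state
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g ** 2
    w = w - lr * (m / (1 - b1 ** t)) / (np.sqrt(v / (1 - b2 ** t)) + eps)
    return w, (m, v)

w0 = np.zeros(5)
sgd_curve = train(sgd_step, w0.copy(), None)
adam_curve = train(adam_step, w0.copy(), (np.zeros(5), np.zeros(5)))
# Both curves should decrease; plotting them side by side gives the
# training-curve comparison described above.
```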
For more details about these algorithms, please refer to my blog (in Chinese).