
SGD Comparison

Test various SGD algorithms on logistic regression and an MLP, including the following (a minimal sketch of a few of the update rules appears after the list):

  • vanilla SGD
  • Momentum
  • Nesterov Accelerated Gradient
  • AdaGrad
  • RMSProp
  • AdaDelta
  • Adam
  • Adamax

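Below is a minimal NumPy sketch of three of the update rules listed above (vanilla SGD, Momentum, Adam). It is illustrative only: the variable names and default hyperparameters are conventional choices and are not taken from this repository's code.

```python
import numpy as np

def sgd_step(param, grad, lr=0.01):
    """Vanilla SGD: step against the gradient."""
    return param - lr * grad

def momentum_step(param, grad, velocity, lr=0.01, mu=0.9):
    """Momentum: accumulate a velocity that smooths successive gradients."""
    velocity = mu * velocity - lr * grad
    return param + velocity, velocity

def adam_step(param, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """Adam: bias-corrected first- and second-moment estimates of the gradient."""
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    m_hat = m / (1 - beta1 ** t)          # bias correction, t starts at 1
    v_hat = v / (1 - beta2 ** t)
    return param - lr * m_hat / (np.sqrt(v_hat) + eps), m, v
```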
The relationships among these algorithms are shown in the following figure (my personal view).

[Figure: relation between the SGD variants]

This code is based on Theano; please install the relevant packages. The implementations of logistic regression and the MLP follow the Theano tutorial.
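
Because the code is built on Theano, an optimizer is typically expressed as a list of (shared variable, new expression) pairs passed to theano.function. The sketch below shows this wiring for the Momentum rule on a toy logistic-regression cost; the shapes, names, and hyperparameters are placeholders for illustration, not this repository's actual code.

```python
import numpy as np
import theano
import theano.tensor as T

x = T.matrix('x')
y = T.ivector('y')

# Toy logistic-regression parameters (shapes are placeholders).
W = theano.shared(np.zeros((784, 10), dtype=theano.config.floatX), name='W')
b = theano.shared(np.zeros(10, dtype=theano.config.floatX), name='b')
p_y = T.nnet.softmax(T.dot(x, W) + b)
cost = -T.mean(T.log(p_y)[T.arange(y.shape[0]), y])  # negative log-likelihood

def momentum_updates(cost, params, lr=0.01, mu=0.9):
    """Build (shared_var, new_expression) pairs implementing SGD with momentum."""
    updates = []
    for p in params:
        g = T.grad(cost, p)
        v = theano.shared(np.zeros(p.get_value().shape, dtype=theano.config.floatX))
        v_new = mu * v - lr * g
        updates.append((v, v_new))
        updates.append((p, p + v_new))
    return updates

# One call to `train` performs a single parameter update on a minibatch.
train = theano.function(inputs=[x, y], outputs=cost,
                        updates=momentum_updates(cost, [W, b]))
```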

Test results

We measure the performance of these SGD algorithms by comparing their training curves and validation errors.

Logistic Regression

[Figure: training curves and validation error for logistic regression]

MLP

[Figure: training curves and validation error for the MLP]

For more details on these algorithms, please refer to my blog post (in Chinese).