NeelayS/optimizers_in_jax

Optimizers in JAX

Toy implementations of the following deep learning optimization algorithms from scratch in JAX (a sketch of each update rule is given after the list):

  • Stochastic Gradient Descent
  • Momentum
  • Adagrad
  • RMSProp
  • Adam
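
As a companion to the list, here is a minimal sketch of each update rule in a functional JAX style. This is an illustrative reimplementation written for this README, not the repository's code; all function names, signatures, and default hyperparameters below are assumptions.

```python
# Illustrative update rules as pure functions over JAX pytrees.
# Names, signatures, and defaults are assumptions, not the repo's API.

import jax
import jax.numpy as jnp

def sgd(params, grads, lr=1e-2):
    # Vanilla SGD: step each parameter against its gradient.
    return jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)

def momentum(params, grads, velocity, lr=1e-2, beta=0.9):
    # Momentum: accumulate an exponentially decayed sum of past gradients.
    velocity = jax.tree_util.tree_map(lambda v, g: beta * v + g, velocity, grads)
    params = jax.tree_util.tree_map(lambda p, v: p - lr * v, params, velocity)
    return params, velocity

def adagrad(params, grads, accum, lr=1e-2, eps=1e-8):
    # Adagrad: scale each coordinate by the root of its summed squared gradients.
    accum = jax.tree_util.tree_map(lambda a, g: a + g**2, accum, grads)
    params = jax.tree_util.tree_map(
        lambda p, g, a: p - lr * g / (jnp.sqrt(a) + eps), params, grads, accum)
    return params, accum

def rmsprop(params, grads, avg_sq, lr=1e-3, decay=0.9, eps=1e-8):
    # RMSProp: like Adagrad, but an exponential moving average of squared
    # gradients keeps the effective step size from vanishing over time.
    avg_sq = jax.tree_util.tree_map(
        lambda a, g: decay * a + (1 - decay) * g**2, avg_sq, grads)
    params = jax.tree_util.tree_map(
        lambda p, g, a: p - lr * g / (jnp.sqrt(a) + eps), params, grads, avg_sq)
    return params, avg_sq

def adam(params, grads, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    # Adam: bias-corrected first- and second-moment estimates of the gradient.
    t = t + 1
    m = jax.tree_util.tree_map(lambda m_, g: b1 * m_ + (1 - b1) * g, m, grads)
    v = jax.tree_util.tree_map(lambda v_, g: b2 * v_ + (1 - b2) * g**2, v, grads)
    params = jax.tree_util.tree_map(
        lambda p, m_, v_: p - lr * (m_ / (1 - b1**t))
                          / (jnp.sqrt(v_ / (1 - b2**t)) + eps),
        params, m, v)
    return params, m, v, t
```

Each function takes the current parameters plus any optimizer state and returns the updated versions, for example:

```python
# Hypothetical usage: one Adam step on a toy quadratic loss.
loss = lambda w: jnp.sum(w ** 2)
w = jnp.ones(3)
m, v, t = jnp.zeros(3), jnp.zeros(3), 0
w, m, v, t = adam(w, jax.grad(loss)(w), m, v, t)
```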
