# Optimizers in JAX

Toy implementations of the following deep learning optimization algorithms, written from scratch in JAX:

- Stochastic Gradient Descent
- Momentum
- Adagrad
- RMSProp
- Adam
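As a rough illustration of what these implementations look like, here is a minimal sketch of each optimizer's update rule as a pure function. Function names, signatures, and hyperparameter defaults (`lr`, `beta`, `eps`, ...) are illustrative choices for this sketch, not taken from the repository itself:

```python
import jax
import jax.numpy as jnp

# Each function takes the current params, the gradients, and any optimizer
# state, and returns the updated params (plus updated state where needed).
# Hyperparameter defaults below are illustrative, not the repo's values.

def sgd(params, grads, lr=0.1):
    return params - lr * grads

def momentum(params, grads, velocity, lr=0.1, beta=0.9):
    velocity = beta * velocity + grads          # accumulate a running direction
    return params - lr * velocity, velocity

def adagrad(params, grads, accum, lr=0.1, eps=1e-8):
    accum = accum + grads ** 2                  # sum of squared gradients
    return params - lr * grads / (jnp.sqrt(accum) + eps), accum

def rmsprop(params, grads, avg_sq, lr=0.01, decay=0.9, eps=1e-8):
    avg_sq = decay * avg_sq + (1 - decay) * grads ** 2   # EMA of squared grads
    return params - lr * grads / (jnp.sqrt(avg_sq) + eps), avg_sq

def adam(params, grads, m, v, t, lr=0.01, b1=0.9, b2=0.999, eps=1e-8):
    m = b1 * m + (1 - b1) * grads               # first-moment EMA
    v = b2 * v + (1 - b2) * grads ** 2          # second-moment EMA
    m_hat = m / (1 - b1 ** t)                   # bias correction (t starts at 1)
    v_hat = v / (1 - b2 ** t)
    return params - lr * m_hat / (jnp.sqrt(v_hat) + eps), m, v

# Quick sanity check: minimize f(x) = sum(x^2) with plain SGD.
loss = lambda x: jnp.sum(x ** 2)
x = jnp.array([2.0, -3.0])
for _ in range(100):
    x = sgd(x, jax.grad(loss)(x), lr=0.1)
```

Each optimizer is stateless in the functional style JAX encourages: the caller threads the optimizer state (velocity, squared-gradient accumulators, moment estimates) through the training loop explicitly rather than mutating it in place.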