
Investigate gradient-based optimisation algorithms #283


Closed
ben18785 opened this issue Mar 24, 2018 · 4 comments

@ben18785
Collaborator

You have probably thought about this already, but I was wondering about trying the following algorithms out on the optimisation side of things (a rough sketch of one such update follows this list):

  • stochastic gradient descent with/without momentum or with Nesterov momentum,
  • Adagrad,
  • RMSprop with/without Nesterov momentum,
  • Adam.
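
For concreteness, the core of these first-order updates is only a few lines each. Below is a minimal sketch of an Adam-style step; the function and argument names are illustrative only and not tied to any existing PINTS interface.

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update. theta: current parameters, grad: gradient of the
    objective at theta, m/v: running first/second moment estimates,
    t: 1-based iteration count."""
    m = b1 * m + (1 - b1) * grad           # update biased first moment
    v = b2 * v + (1 - b2) * grad ** 2      # update biased second moment
    m_hat = m / (1 - b1 ** t)              # bias-corrected first moment
    v_hat = v / (1 - b2 ** t)              # bias-corrected second moment
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v
```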

There are also second order methods that we could look into,

  • Newton's method,
  • conjugate gradients,
  • BFGS.

I have seen all of these used in statistical inference problems before, and I wondered whether they would be useful here.
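
To illustrate the second-order side, here is a bare-bones damped Newton iteration. It assumes the objective can supply a gradient and Hessian (the same model-interface question raised below); the callable signature is hypothetical.

```python
import numpy as np

def newton_minimise(f_grad_hess, theta0, tol=1e-8, max_iter=100):
    """Minimise an objective with Newton's method. `f_grad_hess(theta)` is
    assumed (hypothetically) to return (value, gradient, Hessian)."""
    theta = np.asarray(theta0, dtype=float)
    for _ in range(max_iter):
        value, grad, hess = f_grad_hess(theta)
        # Small diagonal damping keeps the solve well-posed near flat regions
        step = np.linalg.solve(hess + 1e-10 * np.eye(len(theta)), grad)
        theta = theta - step
        if np.linalg.norm(step) < tol:
            break
    return theta
```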

@martinjrobins
Member

+1 for the stochastic gradient descent methods, I've been wanting to implement some of these. The second-order ones would be fairly trivial: we just need to wrap scipy.optimize (scipy is already in the dependencies); a sketch of such a wrapper is below. I think the main design change would be allowing models to return gradients.
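
A wrapper of that kind could be very small. The sketch below assumes a hypothetical `error_and_gradient(x)` callable returning `(error, gradient)`, standing in for whatever interface gradient-returning models would end up with; passing `jac=True` tells scipy that the objective returns both values.

```python
from scipy.optimize import minimize

def fit_with_bfgs(error_and_gradient, x0):
    """Minimise an error measure with SciPy's BFGS using model gradients.
    `error_and_gradient(x)` is assumed to return (error, gradient)."""
    result = minimize(error_and_gradient, x0, jac=True, method='BFGS')
    return result.x, result.fun
```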

@MichaelClerx
Member

Scipy has a bunch of them that we could wrap. None of them work for ion channel problems, so I'm not super interested myself :-)

@MichaelClerx
Member

See also #55 and #54

@MichaelClerx
Member

Duplicate of #684

@MichaelClerx MichaelClerx marked this as a duplicate of #684 Feb 12, 2019