Using the Hessian matrix in optimization #2437
Comments
One thought at the moment is that you can subclass the optimizer:

```python
import scipy.optimize
import pyhf

class scipy_optimizer_hess(pyhf.optimize.scipy_optimizer):
    def _get_minimizer(self, *args, hess=None, **kwargs):
        # Forward hess directly to scipy.optimize.minimize, since the
        # base class does not expose it.
        return lambda *a, **k: scipy.optimize.minimize(*a, **k, hess=hess)
```

and then you can use it like so:

```python
pyhf.set_backend(pyhf.tensorlib, scipy_optimizer_hess())
```

just to get a quick way of having it working for right now. This is probably something that needs a bit more thought.
Thanks, yes, that can work. It's trickier than I first thought, though, because of how it must interact with `fixed_params`.
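For concreteness, here is a minimal sketch of that complication, with hypothetical names (this is not pyhf API): when some parameters are held fixed, scipy minimizes over the free parameters only, so a Hessian computed by autodiff over the full parameter vector has to be restricted to the free-parameter block.

```python
import numpy as np

def make_restricted_hess(full_hess, embed, free_idx):
    """Restrict a full-model Hessian to the free-parameter block.

    full_hess: callable, full parameter vector -> (n, n) Hessian (e.g. autodiff)
    embed:     callable, free-parameter vector -> full parameter vector,
               filling in the values pinned by fixed_params
    free_idx:  indices of the parameters scipy actually minimizes over
    """
    free_idx = np.asarray(free_idx)

    def hess(free_params):
        H = full_hess(embed(free_params))
        # `embed` is a plain injection, so the chain rule reduces to
        # slicing out the rows and columns of the free parameters.
        return H[np.ix_(free_idx, free_idx)]

    return hess
```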
Summary
Some optimization algorithms can make use of the Hessian matrix, e.g., in scipy the methods `Newton-CG`, `dogleg`, `trust-ncg`, `trust-krylov`, `trust-exact`, and `trust-constr`. The scipy API is like this:
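A minimal illustration (the quadratic objective here is made up for the example):

```python
import numpy as np
from scipy.optimize import minimize

def fun(x):
    return x[0] ** 2 + 10 * x[1] ** 2

def jac(x):
    return np.array([2 * x[0], 20 * x[1]])

def hess(x):
    return np.diag([2.0, 20.0])

# hess is a top-level keyword argument of minimize, not an entry in `options`
result = minimize(fun, np.array([1.0, 1.0]), method="Newton-CG", jac=jac, hess=hess)
print(result.x)  # converges to ~ [0, 0]
```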
The Hessian is not passed through the `options` dict; it has its own kwarg `hess`. It would be nice if pyhf passed the Hessian automatically, since it is available cheaply with autodiff. I have tried passing it manually, but I can't get it to work. The interface with scipy's `minimize` looks like this:
`pyhf/src/pyhf/optimize/opt_scipy.py`, lines 93 to 102 at `5b63251`
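As far as I can tell from those lines, user-supplied `solver_options` end up inside the `options` dict of `scipy.optimize.minimize` rather than as top-level keyword arguments. A runnable sketch of why that route cannot deliver a Hessian (objective made up for illustration):

```python
import numpy as np
import scipy.optimize

def fun(x):
    return x[0] ** 2 + 10 * x[1] ** 2

def jac(x):
    return np.array([2 * x[0], 20 * x[1]])

# Entries of `options` are forwarded as extra keyword arguments to the
# method backend; with Newton-CG this collides with the hess argument
# that minimize already forwards, so it raises a TypeError instead of
# enabling the Hessian.
try:
    scipy.optimize.minimize(
        fun,
        np.array([1.0, 1.0]),
        method="Newton-CG",
        jac=jac,
        options={"hess": lambda x: np.diag([2.0, 20.0])},
    )
except TypeError as err:
    print(f"options is not a way in: {err}")
```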
so you can't pass an extra `hess` keyword argument via `solver_options`.