Now that we have Optimization.jl, would it be possible to support a broad class of optimizers for MLJTuning by just supporting the Optimization.jl interface?
That's possible. I'd be happy to support such an effort. I think it will be a bit of work.
At this point, many MLJ predictors do not have differentiable output. So if you're looking to apply optimisers that require gradients, you should know that support for this is yet to be sorted out (but would be worthwhile). Related: FluxML/MLJFlux.jl#220 .
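To make the idea concrete, here is a minimal sketch of the Optimization.jl interface that such a tuning strategy would target. This is not an existing MLJTuning integration: the objective below is a toy stand-in for a cross-validation loss (in practice it would wrap something like `evaluate!` on an MLJ machine), and, per the caveat about non-differentiable output, it uses a gradient-free solver:

```julia
using Optimization, OptimizationOptimJL

# Stand-in for a cross-validation loss: maps a hyperparameter vector `u`
# to an out-of-sample score. Toy quadratic with its minimum at u = [0.3].
cv_loss(u, p) = (u[1] - 0.3)^2 + 0.1

# No gradient supplied, so only derivative-free solvers apply.
objective = OptimizationFunction(cv_loss)
prob = OptimizationProblem(objective, [1.0])  # [1.0] = initial hyperparameter guess
sol = solve(prob, NelderMead())               # sol.u ≈ [0.3]
```

A tuning strategy built this way would only need to expose the hyperparameter ranges as the `u` vector and the resampled loss as the objective; any solver speaking the Optimization.jl interface could then be swapped in.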