Frameworks for HP optimization #55
To clarify, Hyperopt.jl is not related to the Python hyperopt. It uses different optimisation techniques (random search, latin hypercube sampling, and Bayesian optimisation) and deserves its own position in the list. TreeParzen.jl, however, is a direct port of the Python hyperopt to Julia: it uses the same optimisation technique (tree-structured Parzen estimators) and has the same behaviour. (A sketch of Hyperopt.jl's interface follows below.)
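To make the distinction concrete, here is a minimal sketch of Hyperopt.jl's macro interface. The objective and the candidate grids are toy placeholders, not anything from this thread; the point is that the sampler is pluggable (`RandomSampler()`, `LHSampler()` for latin hypercube sampling, `GPSampler(Min)` for Bayesian optimisation).

```julia
using Hyperopt

# Random search over two toy hyperparameters. Swapping the sampler for
# LHSampler() or GPSampler(Min) gives latin hypercube sampling or
# Bayesian optimisation over the same candidate grids.
ho = @hyperopt for i = 50,
        sampler = RandomSampler(),
        lr = exp10.(LinRange(-4, -1, 50)),   # candidate learning rates
        depth = 2:10                         # candidate tree depths
    # toy objective standing in for a real validation loss
    (log10(lr) + 2)^2 + (depth - 5)^2
end

ho.minimizer, ho.minimum   # best (lr, depth) found and its loss
```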
Maybe it is worth considering bandit frameworks such as Ax.
Thanks @vollmersj, added. Please let me know if you have other suggestions.
Hi, I'm really excited to see a Bayesian optimization method for hyperparameter tuning! Thanks a lot for basically everything so far!
Julia HP optimization packages: Hyperopt.jl, TreeParzen.jl
Other HP optimization packages: hyperopt (Python), Optuna, Ax
There are projects that benchmark different AutoML systems: https://openml.github.io/automlbenchmark/
From our conversation: JuliaAI/MLJ.jl#416 (comment)
I wanted to tell you about Optuna (repo & paper), a new framework for HP optimization.
A nice comparison with the Python hyperopt shows what can be done for HP visualization:
https://neptune.ai/blog/optuna-vs-hyperopt
A 3-minute clip: https://www.youtube.com/watch?v=-UeC4MR3PHM
It would really be amazing for MLJ to incorporate this!
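For reference, tuning in MLJ is already exposed through a pluggable strategy slot, so this is roughly where an Optuna-backed strategy would hook in. Below is a minimal sketch using the existing RandomSearch strategy; the model, ranges, and data are illustrative, and an Optuna strategy for MLJ is hypothetical — it would be passed via the `tuning` keyword where `RandomSearch()` appears.

```julia
using MLJ

# Illustrative model choice; any MLJ model works the same way.
Tree = @load DecisionTreeClassifier pkg=DecisionTree verbosity=0
tree = Tree()

# Hyperparameter ranges to search over.
r1 = range(tree, :max_depth, lower=2, upper=10)
r2 = range(tree, :min_samples_split, lower=2, upper=20)

# TunedModel wraps the model; `tuning` is the pluggable strategy slot
# where a hypothetical Optuna-backed strategy would go.
tuned_tree = TunedModel(
    model=tree,
    tuning=RandomSearch(),
    resampling=CV(nfolds=5),
    ranges=[r1, r2],
    measure=cross_entropy,
    n=25,                  # number of hyperparameter points to evaluate
)

X, y = @load_iris
mach = machine(tuned_tree, X, y)
fit!(mach)
report(mach).best_model   # best hyperparameter combination found
```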