Add a benchmarking simulator for the distributed optimization setup #260

Open
nabenabe0928 opened this issue Apr 9, 2025 · 0 comments
Labels
contribution-welcome Contribution welcome issues new-package New packages

Comments

nabenabe0928 commented Apr 9, 2025

Important

This feature could be integrated into OptunaHub rather than into OptunaHub Registry.

Motivation

It is important to compare optimization methods not only in the sequential setup, but also in the distributed setup, which is much more prevalent in practice.
To this end, I would like to suggest integrating the simulator introduced in Fast Benchmarking of Asynchronous Multi-Fidelity Optimization on Zero-Cost Benchmarks, which was accepted at AutoML'24.

Description

Since Optuna supports the ask-and-tell interface, a benchmarking simulator for the distributed optimization setup can easily be integrated.
For example, see the implementation here.

By integrating such a simulator, it becomes much simpler to benchmark samplers in the distributed optimization setup.

@nabenabe0928 nabenabe0928 added contribution-welcome Contribution welcome issues new-package New packages labels Apr 9, 2025