docs: extending the documentation and docstrings

Showing 43 changed files with 549 additions and 242 deletions.
@@ -1,3 +1,78 @@
# Quick-Tune-Tool

**A Practical Tool and User Guide for Automatically Finetuning Pretrained Models**

> Quick-Tune-Tool is an automated solution designed to streamline the process of selecting and finetuning pretrained models across various machine learning domains. Built upon the Quick-Tune algorithm, this tool abstracts complex research-level code into a user-friendly framework, making model finetuning accessible and efficient for practitioners.

---

## Installation

```bash
pip install quicktunetool
# or
git clone https://github.com/automl/quicktunetool
pip install -e quicktunetool  # Use -e for editable mode
```

---

## Usage

A simple example of using Quick-Tune-Tool with a pretrained optimizer for image classification:

```python
from qtt import QuickTuner, get_pretrained_optimizer
from qtt.finetune.cv.classification import finetune_script, extract_task_info_metafeat  # path for extract_task_info_metafeat assumed

# Load task information and meta-features
task_info, metafeat = extract_task_info_metafeat("path/to/dataset")

# Initialize the optimizer
optimizer = get_pretrained_optimizer("mtlbm/micro")
optimizer.setup(128, metafeat)

# Create a QuickTuner instance and run
qt = QuickTuner(optimizer, finetune_script)
qt.run(task_info, time_budget=3600)
```

This snippet demonstrates how to run QTT on an image dataset in just a few lines of code.

---

## Contributing

Contributions are welcome! Please follow these steps:

1. Fork the repository
2. Create a new branch (`git checkout -b feature/YourFeature`)
3. Commit your changes (`git commit -m 'Add your feature'`)
4. Push to the branch (`git push origin feature/YourFeature`)
5. Open a pull request

For any questions or suggestions, please contact the maintainers.

---

## Project Status

- ✅ Active development

---

## Support

- 📝 [Documentation](https://automl.github.io/quicktunetool/)
- 🐛 [Issue Tracker](https://github.com/automl/quicktunetool/issues)
- 💬 [Discussions](https://github.com/automl/quicktunetool/discussions)

---

## License

This project is licensed under the BSD License - see the LICENSE file for details.

---

Made with ❤️ by https://github.com/automl
@@ -1 +1,61 @@
# Code References

This section provides references for the core code components of Quick-Tune-Tool, detailing the primary modules and classes that make up the tool's architecture. The code is organized into three main parts: **Optimizers**, **Predictors**, and **Tuners**.

---

## 1. Optimizers

The Optimizers module is responsible for suggesting configurations for evaluation, using various optimization strategies. Available optimizers include:

- **QuickTune Optimizer**
    - File: `optimizers/quick.py`
    - Implements the QuickTune algorithm, balancing multi-fidelity expected improvement with cost estimation to select configurations.

- **Random Search Optimizer**
    - File: `optimizers/random.py`
    - Provides a basic random search optimizer as a baseline for comparison with other optimization strategies.
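
Regardless of the concrete strategy, every optimizer is driven through the same `ask`/`tell` interface (see the optimizer overview later in these docs). The following is a purely illustrative sketch of that cycle; `optimizer` stands for any qtt optimizer and `objective` is a placeholder for your own evaluation function:

```python
# Illustrative sketch of the suggest-evaluate-report cycle an optimizer supports.
# `optimizer` is any qtt optimizer; `objective` is a placeholder evaluation function.
for _ in range(10):
    trial = optimizer.ask()      # request a configuration to evaluate
    report = objective(trial)    # run the evaluation (e.g., a short finetuning run)
    optimizer.tell(report)       # feed the result back to guide future suggestions
```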

---

## 2. Predictors

The Predictors module includes components that estimate model performance and finetuning costs, enabling efficient configuration selection.

- **Performance Predictor**
    - File: `predictors/perf.py`
    - Uses meta-learning to estimate the potential performance of a model configuration based on historical data and auxiliary task information.

- **Cost Predictor**
    - File: `predictors/cost.py`
    - Evaluates the computational cost associated with different finetuning configurations, helping to balance resource efficiency with optimization goals.
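
As a rough, purely illustrative picture of how the two estimates can work together (this is a simplification, not the actual QuickTune acquisition; `perf_predictor`, `cost_predictor`, and `candidates` are assumed placeholders):

```python
# Illustrative only: rank candidate configurations by predicted performance per unit cost.
# Assumes both objects expose the `predict` method of the Predictor base class.
perf = perf_predictor.predict(candidates)
cost = cost_predictor.predict(candidates)
best_idx = max(range(len(candidates)), key=lambda i: perf[i] / cost[i])
best_config = candidates[best_idx]
```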

---

## 3. Tuners

The Tuners module coordinates the tuning process, managing environment setup, experiment flow, and result handling.

- **QuickTuner**
    - File: `tuners/quick.py`
    - The central class that drives the tuning process, integrating optimizers and predictors to handle iterative evaluations and updates.

- **CV-Classification**
    - File: `tuners/cv_cls.py`
    - A specialized tuner for image classification tasks, offering a reduced interface where users simply provide the path to the image dataset.

---

## Additional Resources

- **Objective Functions**
    - Directory: `objective/`
    - Functions used to evaluate configurations, returning performance metrics for each step.

- **Utility Scripts**
    - Directory: `utils/`
    - A collection of helper functions and utilities to support data processing, result logging, and other ancillary tasks.

---

Refer to each module's in-code documentation for further details on function arguments, usage examples, and dependencies.
@@ -0,0 +1,24 @@
# Overview

The `Optimizer` class serves as a base class within the Quick-Tune-Tool, providing low-level functionality. It is designed to support flexible configuration management and to interact with tuners during the optimization process. Key aspects of the class include directory setup, model saving, and interfacing methods for requesting and reporting trial evaluations.

Here’s an overview of the [`Optimizer`][qtt.optimizers.optimizer] class:

#### Core Methods

- **`ask`**: Abstract method that must be implemented in subclasses. It requests a new configuration trial from the optimizer, returning it as a dictionary. Raises `NotImplementedError` if not overridden.

- **`tell`**: Accepts and processes a report (result) from a trial evaluation. This method allows the optimizer to record outcomes for each configuration and adjust future suggestions accordingly. Supports both single and multiple trial reports.

- **`ante`**: A placeholder method for pre-processing tasks to be performed before requesting a configuration trial (used by tuners). Can be overridden in subclasses for custom pre-processing.

- **`post`**: A placeholder method for post-processing tasks, executed after a trial evaluation has been submitted. Designed to be overridden in subclasses for custom post-processing.

This class is intended to be extended for specific optimization strategies, with `ask` and `tell` as the primary methods for interaction with tuners.
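
A minimal sketch of such an extension (illustrative only: the import path and the base-class constructor arguments are assumptions, and a real optimizer would also track fidelities and budgets):

```python
import random

from qtt.optimizers import Optimizer  # assumed import path for the base class


class RandomGridOptimizer(Optimizer):
    """Toy optimizer that samples configurations uniformly from a fixed grid."""

    def __init__(self, grid: list[dict], **kwargs):
        super().__init__(**kwargs)  # forward base-class arguments (e.g., output directory)
        self.grid = grid

    def ask(self) -> dict:
        # Suggest the next configuration trial to evaluate.
        return random.choice(self.grid)
```

A tuner then drives such an optimizer through the `ask`/`tell` cycle described above.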

---

### Available Optimizers

- [**`RandomOptimizer`**][qtt.optimizers.random]
- [**`QuickOptimizer`**][qtt.optimizers.quick]
@@ -1 +1,31 @@
# Overview

The `Predictor` class serves as a base class for implementing predictive models within the Quick-Tune-Tool. It provides core functionality for model setup, data handling, training, and persistence (saving/loading), allowing specific predictive models to extend and customize these methods.

#### Core Methods

- **`fit`** and **`_fit`**:
    - `fit`: Public method for training the model. It takes feature data `X`, target labels `y`, a verbosity level, and any additional arguments.
    - `_fit`: Abstract method where the specific model training logic is implemented. Models inheriting from `Predictor` should override `_fit` to implement their own fitting procedures.

- **`preprocess`** and **`_preprocess`**:
    - `preprocess`: Wrapper method that calls `_preprocess` to prepare data for fitting or prediction.
    - `_preprocess`: Abstract method where data transformation logic should be added. Designed to clean and structure input data before model training or inference.

- **`load`** and **`save`**:
    - `load`: Class method to load a saved model from disk, optionally resetting its path and logging the location.
    - `save`: Saves the current model to disk at a specified path, providing persistence for trained models.

- **`predict`**:
    Abstract method for generating predictions on new data. Specific predictive models should implement this method based on their inference logic.

This `Predictor` class offers a foundation for different predictive models, providing essential methods for data handling, training, and saving/loading, with extensibility for custom implementations.
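
A minimal sketch of a custom predictor built on this interface (illustrative only: the import path, the exact `_fit`/`_preprocess` signatures, and the use of scikit-learn are assumptions, not the tool's own implementation):

```python
import numpy as np
from sklearn.linear_model import Ridge

from qtt.predictors import Predictor  # assumed import path for the base class


class RidgePerfPredictor(Predictor):
    """Toy performance predictor backed by a ridge regression model."""

    def _preprocess(self, X):
        # Clean/structure the input features before training or inference.
        return np.asarray(X, dtype=np.float32)

    def _fit(self, X, y, **kwargs):
        # Model-specific training logic goes here.
        self.model = Ridge(alpha=1.0)
        self.model.fit(self.preprocess(X), np.asarray(y, dtype=np.float32))
        return self

    def predict(self, X):
        # Inference on new data.
        return self.model.predict(self.preprocess(X))
```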

---

#### Available Predictors

- [**`PerfPredictor`**][qtt.predictors.perf]
  Predicts the performance of a configuration on a new dataset.
- [**`CostPredictor`**][qtt.predictors.cost]
  Predicts the cost of training a configuration on a new dataset.
@@ -0,0 +1,3 @@
# CV-Classification-Tuner

::: qtt.tuner.cv_cls
@@ -1 +1,19 @@
# Overview

The `QuickTuner` class is a high-level tuner designed to optimize a given objective function by managing iterative evaluations and coordinating with an `Optimizer`. It provides comprehensive functionality for logging, result tracking, checkpointing, and handling evaluation budgets.

#### Core Methods

- **`run`**: Executes the optimization process within a specified budget of function evaluations (`fevals`) or time (`time_budget`). This method iteratively:
    - Requests new configurations from the optimizer.
    - Evaluates configurations using the objective function `f`.
    - Updates the optimizer with evaluation results and logs progress.
    - Saves results based on the specified `save_freq` setting.

- **`save`** and **`load`**:
    - `save`: Saves the current state of the tuner, including the incumbent, evaluation history, and tuner state.
    - `load`: Loads a previously saved tuner state to resume optimization from where it left off.

#### Usage Example

The `QuickTuner` is typically used to optimize an objective function with the support of an optimizer, managing configuration sampling, evaluation, and tracking. It is particularly suited for iterative optimization tasks where tracking the best configuration and logging results are essential.
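
A minimal sketch of that wiring (illustrative only: `my_objective` is a hypothetical objective function and its report format is an assumption; in practice the bundled finetune scripts, such as the one in the README example, play this role):

```python
from qtt import QuickTuner, get_pretrained_optimizer


def my_objective(trial: dict, **kwargs) -> dict:
    """Hypothetical objective: evaluate the suggested configuration and report back."""
    score = 0.0  # placeholder: finetune/evaluate `trial` here and compute a validation score
    return {"config": trial, "score": score}


# Optimizer setup (meta-features, etc.) omitted for brevity; see the README example.
optimizer = get_pretrained_optimizer("mtlbm/micro")

qt = QuickTuner(optimizer, my_objective)

# Stop after 100 function evaluations or one hour, whichever comes first.
qt.run(fevals=100, time_budget=3600)
```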
@@ -0,0 +1,3 @@
# QuickTuner

::: qtt.tuner.quick
@@ -1,4 +1,4 @@
 """A quick example of using QuickCVCLSTuner to tune vision classifiers on a dataset."""
-from qtt import QuickCVCLSTuner
-tuner = QuickCVCLSTuner("path/to/dataset")
+from qtt import QuickTuner_CVCLS
+tuner = QuickTuner_CVCLS("path/to/dataset")
 tuner.run(fevals=100, time_budget=3600)