This repo demonstrates how to take the basic model training code in `src/mlops_demo/train_basic.py` and add MLOps tooling to track experiments and package the resulting model into a deployable Docker container. This is done using MLflow and BentoML.
- Open https://github.com/radix-ai/mlops-meetup-demo in your browser.
- Click on Code and select Create codespace to start a Dev Container with GitHub Codespaces.
To track runs with MLflow we need to run an MLflow tracking server. The following command starts a local server:

```shell
mlflow server \
  --backend-store-uri sqlite:///mlflow.db \
  --default-artifact-root file:///app/mlflow/ \
  --host 127.0.0.1
```
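MLflow clients find the server through the `MLFLOW_TRACKING_URI` environment variable; with the defaults above the server listens on port 5000 (the port is an assumption based on MLflow's default, since the command does not override it):

```shell
# Point MLflow clients (e.g. the training script) at the local tracking server.
export MLFLOW_TRACKING_URI=http://127.0.0.1:5000
```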
With the MLflow server running, run the training script:

```shell
poetry run python src/mlops_demo/train.py
```
Register the latest model from MLflow in BentoML so that it can be used by the BentoML service:

```shell
python /app/src/mlops_demo/register_model.py
```
We can run the service we defined in `src/mlops_demo/service.py` with the `iris_model` we registered in `train.py` as follows:

```shell
bentoml serve src.mlops_demo.service:svc --reload --port=8000
```
This uses the service defined in `src/mlops_demo/service.py` and the `iris_model` we registered in `train.py` to build a deployable Docker image. Additional configuration is defined in `bentofile.yaml`.
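A `bentofile.yaml` for a setup like this might look as follows (the included paths and package list are assumptions; the repo's actual file may differ):

```yaml
service: "src.mlops_demo.service:svc"  # same service reference used by `bentoml serve`
include:
  - "src/"
python:
  packages:
    - mlflow
    - scikit-learn
```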
- Build a bento:

  ```shell
  bentoml build
  ```

- Turn it into a Docker container:

  ```shell
  bentoml containerize iris_service:latest
  ```
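The resulting image can then be run and queried like any other container. A hedged usage sketch, assuming BentoML's default port 3000 and an endpoint named `classify` (check the `bentoml containerize` output for the exact image tag):

```shell
# Run the containerized service locally (requires a Docker daemon).
docker run --rm -p 3000:3000 iris_service:latest

# In another terminal, POST four iris features to the hypothetical /classify endpoint:
curl -X POST http://127.0.0.1:3000/classify \
  -H "Content-Type: application/json" \
  -d "[[5.1, 3.5, 1.4, 0.2]]"
```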