This is a simple Python service template for training machine learning models asynchronously.
It is an example of how to handle asynchronous model training with Falcon, Celery, and a message queue.
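Under the hood, the web process hands training off to a Celery worker through the message queue, so the HTTP request returns immediately. Below is a minimal sketch of such a task; the broker URL, module name, and task body are assumptions for illustration, not this repo's actual layout:

# tasks.py -- hypothetical sketch: a Celery task that trains a model off the request path.
from celery import Celery

# Broker/backend URLs are placeholders; this repo may use a different queue.
celery_app = Celery("ml_tasks",
                    broker="redis://localhost:6379/0",
                    backend="redis://localhost:6379/0")

@celery_app.task
def train_model(alpha=1.0):
    """Long-running training job executed by the worker, not the web server."""
    # ... load data, fit the model with the given hyperparameters, persist it ...
    return {"status": "trained", "alpha": alpha}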
Spin up the containers:
$ docker-compose up -d --build
Open your browser to http://localhost:8000/ping to view the app or to http://localhost:5555 to view the Flower dashboard.
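The /ping route is just a lightweight liveness check on the Falcon app. A rough sketch of how such a resource might be wired up (class name and route wiring are assumptions, Falcon 3-style API):

# app.py -- hypothetical sketch of the Falcon app with a health-check route.
import falcon

class Ping:
    def on_get(self, req, resp):
        resp.media = {"message": "pong"}  # confirms the web process is up

app = falcon.App()
app.add_route("/ping", Ping())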
Trigger a new task with hyperparameters (optional):
$ curl -X POST http://localhost:8000/create \
-d '{"alpha":0.5}' \
-H "Content-Type: application/json"
Check the status:
$ curl http://localhost:8000/status/<TASK_ID>
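The status endpoint simply polls Celery for the task's state, so nothing blocks while the model trains. A minimal sketch under the same assumed module names:

# status.py -- hypothetical sketch of the status endpoint polling Celery for task state.
from tasks import train_model  # the Celery task sketched earlier

class TaskStatus:
    def on_get(self, req, resp, task_id):
        result = train_model.AsyncResult(task_id)  # look the task up by id
        resp.media = {
            "task_id": task_id,
            "state": result.state,                 # PENDING / STARTED / SUCCESS / FAILURE
            "result": result.result if result.successful() else None,
        }

# app.add_route("/status/{task_id}", TaskStatus())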
The model used in this repo is a general text classification model built with tf-idf features and a naive Bayes classifier, but you can plug in any model of your choice.
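For reference, a tf-idf plus naive Bayes text classifier like the one described takes only a few lines with scikit-learn. The sketch below is illustrative rather than this repo's exact training code; alpha is the smoothing hyperparameter passed in the /create example:

# model.py -- illustrative sketch of a tf-idf + naive Bayes text classifier.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

def train(texts, labels, alpha=1.0):
    """Fit tf-idf features and a naive Bayes classifier; swap in your own corpus and labels."""
    model = make_pipeline(TfidfVectorizer(), MultinomialNB(alpha=alpha))
    model.fit(texts, labels)
    return model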
Everything in this toolkit is released under Twilio Labs and is fully open source. If you find any problems, please file an issue or open a pull request to work on the toolkit together with us. We would love to hear your ideas and feedback!
MIT