
Provide a feature to save all model inputs/outputs to external storage for later debugging or replay #317

Open
withsmilo opened this issue Aug 19, 2022 · 2 comments

@withsmilo (Contributor) commented Aug 19, 2022

Hi, BentoML team.
This is a new feature suggestion for Yatai. When serving an ML model, it is common practice to store the inputs sent to the model and the outputs it returns in external storage so they can be used later for debugging or replay. Most ML services that incorporate feedback need this capability, and it would be great if BentoML / Yatai provided it out of the box.
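For illustration, here is a minimal sketch of the kind of per-request logging a service has to hand-roll today. It assumes the `iris_clf` model from the BentoML quickstart and uses a local JSONL file as a stand-in for real external storage such as S3; none of this is an existing BentoML / Yatai feature.

```python
import json
import time
import uuid
from pathlib import Path

import bentoml
import numpy as np
from bentoml.io import NumpyNdarray

# Stand-in for external storage (e.g. S3 or GCS): a local JSONL file.
LOG_PATH = Path("/tmp/inference_log.jsonl")

iris_runner = bentoml.sklearn.get("iris_clf:latest").to_runner()
svc = bentoml.Service("iris_classifier", runners=[iris_runner])


@svc.api(input=NumpyNdarray(), output=NumpyNdarray())
async def classify(input_arr: np.ndarray) -> np.ndarray:
    output_arr = await iris_runner.predict.async_run(input_arr)

    # Persist the request/response pair so it can be inspected or replayed later.
    record = {
        "request_id": str(uuid.uuid4()),
        "timestamp": time.time(),
        "input": input_arr.tolist(),
        "output": np.asarray(output_arr).tolist(),
    }
    with LOG_PATH.open("a") as f:
        f.write(json.dumps(record) + "\n")

    return output_arr
```

Having this handled by BentoML / Yatai itself would remove the need for every service to implement, and replay from, such ad-hoc logs.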

@ssheng (Contributor) commented Aug 19, 2022

@withsmilo We are in the design phase of a model monitoring solution that offers APIs for logging features and inference results, along with configuration for shipping the logs to a destination of your choice. If possible, we can get on a call to walk through our design with you and verify that it meets your requirements.
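For later readers, a purely illustrative sketch of what an API of that shape could look like from application code. The `bentoml.monitor` context manager and its arguments below are hypothetical placeholders for the design described above, not a confirmed interface; the destination the logs are shipped to would be selected via configuration rather than in code.

```python
import bentoml

# Hypothetical example values standing in for real request features and a model output.
sepal_len, sepal_width = 5.1, 3.5
predicted_class = "setosa"

# Hypothetical logging API (placeholder names, not a confirmed BentoML interface):
# log each feature and the inference result; shipping the records to external
# storage would be handled by separate configuration.
with bentoml.monitor("iris_classifier_inference") as mon:
    mon.log(sepal_len, name="sepal_len", role="feature", data_type="numerical")
    mon.log(sepal_width, name="sepal_width", role="feature", data_type="numerical")
    mon.log(predicted_class, name="pred", role="prediction", data_type="categorical")
```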

@withsmilo (Contributor, Author) commented

@ssheng really great news! Thanks!

@yetone added the feature label Nov 21, 2022