Hi, BentoML team.
This is a new suggestion for Yatai. In general, when serving an ML model, the input provided to the model and the output returned by the model are stored in external storage and used for debugging or replay. This is a feature required by most ML services that need "feedback", and it would be good if BentoML / Yatai provided it by default.
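To make the suggestion concrete, here is a minimal sketch of the kind of request/response logging described above, using an append-only JSONL file in place of external storage. The `log_inference` and `replay` helpers are hypothetical names for illustration, not BentoML or Yatai APIs:

```python
import json
import time
import uuid
from pathlib import Path

def log_inference(log_file: Path, model_input, model_output) -> str:
    """Append one inference record (input + output) as a JSON line.

    In a real deployment this would ship to external storage
    (e.g. object storage or a log pipeline) instead of a local file.
    """
    record = {
        "request_id": str(uuid.uuid4()),
        "timestamp": time.time(),
        "input": model_input,
        "output": model_output,
    }
    with log_file.open("a") as f:
        f.write(json.dumps(record) + "\n")
    return record["request_id"]

def replay(log_file: Path, predict_fn):
    """Re-run a model over logged inputs for debugging.

    Yields (logged_output, new_output) pairs so a newer model
    version can be compared against recorded production traffic.
    """
    with log_file.open() as f:
        for line in f:
            record = json.loads(line)
            yield record["output"], predict_fn(record["input"])
```

For example, after logging a few requests, `replay(log_file, new_model.predict)` lets you diff a candidate model's outputs against what was served in production.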
@withsmilo We are in the design phase of a model monitoring solution that will offer APIs for logging features and inference results, plus configuration for shipping the logs to a destination of your choice. If possible, we can get on a call to walk through our design with you and verify that it meets your requirements.