How to use minio to serve the model? #2682
-
Is there any example of using MinIO to serve a model? I modified the TensorFlow example to serve ResNet50 from MinIO, changing the S3 endpoint URL to MinIO's cluster IP. Currently the container log shows:
Does that mean the model was not downloaded?
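For context, this is a minimal sketch of the Secret/ServiceAccount pairing that KServe's S3 support expects when pointing at an in-cluster MinIO, assuming the `serving.kserve.io/s3-*` annotation keys from the KServe S3 credential docs; the names, endpoint address, and credentials below are placeholders:

```yaml
apiVersion: v1
kind: Secret
metadata:
  name: minio-s3-secret              # hypothetical name
  annotations:
    serving.kserve.io/s3-endpoint: minio-service.default:9000  # assumed in-cluster MinIO address
    serving.kserve.io/s3-usehttps: "0"                         # plain HTTP for in-cluster MinIO
type: Opaque
stringData:
  AWS_ACCESS_KEY_ID: minio           # placeholder credentials
  AWS_SECRET_ACCESS_KEY: minio123
---
apiVersion: v1
kind: ServiceAccount
metadata:
  name: minio-sa                     # hypothetical; referenced by the InferenceService spec
secrets:
  - name: minio-s3-secret
```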
-
TF Serving is looking for a numeric version sub-folder. You can download the model from the TF example and use that as a reference: https://github.com/kserve/kserve/tree/master/docs/samples/v1beta1/tensorflow

gsutil cp -r gs://kfserving-examples/models/tensorflow/flowers .

Refer to these links on using MinIO with KServe:
https://github.com/kubeflow/pipelines/blob/master/samples/contrib/pytorch-samples/mino-secret.yaml
https://github.com/kubeflow/pipelines/blob/master/samples/contrib/pytorch-samples/Pipeline-Cifar10.ipynb
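For reference, the flowers sample unpacks to roughly this layout; the numeric `0001` directory is the version sub-folder TF Serving looks for, so the bucket path you point `storageUri` at should contain a folder like it:

```
flowers/
└── 0001/
    ├── saved_model.pb
    └── variables/
        ├── variables.data-00000-of-00001
        └── variables.index
```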
-
@jagadeeshi2i I have been following the instructions, but the pod is stuck in Init on the storage-initializer container. I dumped its log and it doesn't print any errors. Using the local mc client, I can also see that the files were uploaded.
This is the S3 spec:
I'm not sure what the problem is.
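The storage initializer just downloads whatever `storageUri` resolves to, so one easy thing to double-check is that the URI in the spec splits into the same bucket and prefix that `mc ls` shows. A rough sketch of that split (the function name here is mine, not a KServe API):

```python
from urllib.parse import urlparse

def parse_storage_uri(uri):
    """Split an s3:// storageUri into (bucket, key prefix),
    similar in spirit to what the storage initializer does."""
    parsed = urlparse(uri)
    if parsed.scheme != "s3":
        raise ValueError(f"expected an s3:// uri, got {uri!r}")
    return parsed.netloc, parsed.path.lstrip("/")

# Example: compare against what `mc ls myminio/mybucket` lists
bucket, prefix = parse_storage_uri("s3://mybucket/models/resnet50")
```

If the bucket or prefix doesn't match what the mc client shows, the initializer can hang or download nothing without an obvious error.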