Doc conflict about compatibility between ONNX version & ML opset version #6134
Comments
This simply means that ONNX Runtime takes that ONNX version as a dependency but has not yet implemented full support for the newest opsets. Is it possible for you to run an image with a newer ONNX Runtime installed?
For now I have to run this version due to a hardware/driver problem. I updated my test table and am only more confused about why onnxruntime==1.10 works with both onnx==1.10 and onnx==1.11.
Ask a Question
Question
There is a conflict in the official docs about the ML opset version supported by onnx==1.11:
- in "ONNX Runtime compatibility - ONNX opset support": onnx==1.11 supports ml-opset=2
- in "ONNX Versioning": onnx==1.11 supports ml-opset=3
Further information
I had to run an nvidia-tritonserver:21.10-py3 Docker image, which has onnxruntime==1.9 built in, and when I started the server with my exported ONNX model I got an error like
Then I checked the docs mentioned above and found the conflict.
So I reinstalled onnxruntime via pip in my model-conversion env and got the following results
Notes