enable_opencl_throttling should be a boolean #24344
Answered by thoron
Mat2001mat asked this question in EP Q&A
I'm having trouble with GPU inference and I'm not sure what I'm missing. I'm creating the inference session like this.
When I run inference, I get this:
I'm not sure how I can set this option. I've tried different options.
thoron · Apr 15, 2025
Replies: 1 comment · 3 replies
You might be able to work around this issue by explicitly setting the config value before adding the EP:

```csharp
var device = "GPU";
options.AddSessionConfigEntry("enable_opencl_throttling", "true"); // Or "false" - https://onnxruntime.ai/docs/execution-providers/OpenVINO-ExecutionProvider.html#summary-of-options
options.AppendExecutionProvider_OpenVINO(device);
inferenceSession = new InferenceSession(modelPath, options);
```
I did some digging and found that the AppendExecutionProvider docs do not reflect actual usage; OpenVINO can be specified as the provider name: https://onnxruntime.ai/docs/api/csharp/api/Microsoft.ML.OnnxRuntime.SessionOptions.html#Microsoft_ML_OnnxRuntime_SessionOptions_AppendExecutionProvider_System_String_System_Collections_Generic_Dictionary_System_String_System_String__
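For anyone finding this later, a minimal sketch of that dictionary-based overload, assuming a recent ONNX Runtime build with the OpenVINO EP; the option names are taken from the OpenVINO EP "summary of options" table linked above and may differ between versions, and `modelPath` is a placeholder for your own model file:

```csharp
using System.Collections.Generic;
using Microsoft.ML.OnnxRuntime;

var options = new SessionOptions();

// Provider options are passed as strings; the EP parses
// "enable_opencl_throttling" as a boolean internally.
var ovOptions = new Dictionary<string, string>
{
    { "device_type", "GPU" },
    { "enable_opencl_throttling", "true" }
};

// Generic overload: provider name plus its option dictionary.
options.AppendExecutionProvider("OpenVINO", ovOptions);

var session = new InferenceSession(modelPath, options);
```

This avoids `AppendExecutionProvider_OpenVINO` entirely, so the provider-specific settings travel with the EP registration rather than as session config entries.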