Support deterministic algorithm configuration in PyTorch backend #8186

Open
yhna940 opened this issue May 5, 2025 · 0 comments
yhna940 commented May 5, 2025

Is your feature request related to a problem? Please describe.
Some users require deterministic outputs for reproducibility, but the PyTorch backend does not currently expose a way to enable deterministic algorithms.

Describe the solution you'd like
Add a model configuration parameter to enable at::Context::setDeterministicAlgorithms(...) during model execution.
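For reference, a minimal C++ sketch of what flipping this switch inside the backend could look like. The helper name, the `warn_only` default, and the idea of driving it from a hypothetical `config.pbtxt` parameter like `ENABLE_DETERMINISTIC_ALGORITHMS` are assumptions for illustration, not the linked PR's actual implementation:

```cpp
#include <ATen/Context.h>

#include <cstdlib>

// Hypothetical helper: enable or disable PyTorch's deterministic-algorithm
// mode process-wide. Intended to be called once during model instance
// initialization, driven by a model configuration parameter.
void SetDeterministicAlgorithms(bool enable, bool warn_only = false)
{
  // With warn_only=true, PyTorch only warns (instead of throwing) when an
  // operator has no deterministic implementation.
  at::globalContext().setDeterministicAlgorithms(enable, warn_only);

  if (enable) {
    // Deterministic cuBLAS on CUDA >= 10.2 may additionally require this
    // environment variable to be set before the first cuBLAS call; shown
    // here only as a reminder, whether the backend should own this is an
    // open question.
    setenv("CUBLAS_WORKSPACE_CONFIG", ":4096:8", /*overwrite=*/0);
  }
}
```

A usage note, also assumed: the backend would parse a boolean parameter from the model's `config.pbtxt` and call the helper above in its instance setup path, so determinism can be toggled per model without code changes.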

Describe alternatives you've considered
N/A

Additional context
I’ve opened a small PR to add this feature and would appreciate a review: triton-inference-server/pytorch_backend#150
