Implementation of flash attention for native webgpu ep #22932
base: main
Changes from 24 commits:
52656bf
ed8bf5d
75aa49d
58157c5
80296aa
c281f84
3d25852
b19070a
228b840
122e5f9
8b5fcc7
ed310de
ee1051f
d1d8175
6febf6c
ec59ba5
0dd5b70
8bcb47a
832c323
a814770
ab11009
563e662
ce2031e
bf1b146
d1e442e
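For context on what this PR implements: flash attention avoids materializing the full attention score matrix by streaming over key/value tiles and maintaining a running max, running denominator, and rescaled accumulator (the "online softmax" recurrence). The sketch below is a hypothetical plain-C++ illustration of that recurrence, not code from this PR; the actual WebGPU EP would express the same loop as WGSL shaders with workgroup tiling. The function name and precomputed-scores interface are assumptions made to keep the sketch self-contained. Calling it with `tile == scores.size()` reproduces ordinary softmax attention, which makes the rescaling step easy to unit-test.

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <limits>
#include <vector>

// Hypothetical sketch of the online-softmax recurrence at the core of flash
// attention: one query row attends over precomputed scores in tiles, and the
// running max/denominator/accumulator are rescaled per tile so the full
// softmax is never materialized. Not the PR's implementation.
std::vector<float> OnlineSoftmaxAttend(const std::vector<float>& scores,
                                       const std::vector<std::vector<float>>& values,
                                       std::size_t tile) {
  const std::size_t d = values.at(0).size();
  float running_max = -std::numeric_limits<float>::infinity();
  float running_sum = 0.0f;
  std::vector<float> acc(d, 0.0f);

  for (std::size_t start = 0; start < scores.size(); start += tile) {
    const std::size_t end = std::min(start + tile, scores.size());

    // New running max including this tile.
    float tile_max = running_max;
    for (std::size_t i = start; i < end; ++i) {
      tile_max = std::max(tile_max, scores[i]);
    }

    // Rescale previously accumulated results to the new max.
    const float correction = std::exp(running_max - tile_max);
    running_sum *= correction;
    for (float& a : acc) a *= correction;

    // Accumulate this tile's softmax-weighted values.
    for (std::size_t i = start; i < end; ++i) {
      const float p = std::exp(scores[i] - tile_max);
      running_sum += p;
      for (std::size_t j = 0; j < d; ++j) acc[j] += p * values[i][j];
    }
    running_max = tile_max;
  }

  // Final normalization by the softmax denominator.
  for (float& a : acc) a /= running_sum;
  return acc;
}
```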
GitHub Actions / Optional Lint C++: check warning on line 15 in onnxruntime/contrib_ops/webgpu/bert/flash_attention.h
GitHub Actions / Optional Lint C++: check warning on line 36 in onnxruntime/contrib_ops/webgpu/bert/flash_attention.h