Flash attention support for megatron.core.transformer.attention.SelfAttention #1363
Unanswered
sinamoeini-amz asked this question in Q&A
Replies: 0
Hi folks, is there a plan for including flash attention support for megatron.core.transformer.attention.SelfAttention?
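To illustrate the kind of integration being asked about, here is a minimal sketch, assuming PyTorch 2.x, of a core-attention module whose forward dispatches to a fused flash kernel via torch.nn.functional.scaled_dot_product_attention. The FlashCoreAttention class and its interface are hypothetical illustrations, not Megatron's actual API.

```python
# Minimal sketch (hypothetical, not Megatron's actual API) of a
# core-attention module that routes the softmax(QK^T)V computation
# through PyTorch's scaled_dot_product_attention, which selects a
# FlashAttention backend on supported GPUs with fp16/bf16 inputs.
import torch
import torch.nn.functional as F


class FlashCoreAttention(torch.nn.Module):
    """Hypothetical drop-in core attention using a fused flash kernel."""

    def __init__(self, dropout_p: float = 0.0, causal: bool = True):
        super().__init__()
        self.dropout_p = dropout_p
        self.causal = causal

    def forward(self, q: torch.Tensor, k: torch.Tensor, v: torch.Tensor) -> torch.Tensor:
        # q, k, v: [batch, num_heads, seq_len, head_dim]
        # is_causal=True lets the kernel apply the causal mask internally,
        # avoiding a materialized attention-mask tensor.
        return F.scaled_dot_product_attention(
            q, k, v,
            dropout_p=self.dropout_p if self.training else 0.0,
            is_causal=self.causal,
        )
```

Such a module would stand in for the dense core-attention step inside a SelfAttention block; whether and how megatron.core exposes a hook for this substitution is exactly what the question asks.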