Add the position_embeddings param to LlamaAttention.forward (#105)
* fix: add the position_embeddings arg to LlamaAttention.forward

The upstream signature gained a new argument here: https://github.com/huggingface/transformers/blame/main/src/transformers/models/llama/modeling_llama.py#L316
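A minimal sketch of the signature change being tracked (hypothetical, simplified; not the real `LlamaAttention` implementation): in transformers >= 4.43, the rotary embeddings (cos, sin) are computed once at the model level and passed down to each attention layer as a `position_embeddings` tuple, so any subclass overriding `forward` must accept that parameter.

```python
from typing import Optional, Tuple

class LlamaAttentionSketch:
    """Hypothetical stand-in illustrating the new parameter, not the real class."""

    def forward(
        self,
        hidden_states,
        attention_mask=None,
        position_ids=None,
        past_key_value=None,
        output_attentions=False,
        use_cache=False,
        cache_position=None,
        # New in transformers >= 4.43: precomputed rotary (cos, sin) pair.
        position_embeddings: Optional[Tuple] = None,
        **kwargs,
    ):
        # The real implementation applies cos/sin to the query/key projections;
        # this sketch only shows that the argument is received and unpacked.
        cos, sin = position_embeddings if position_embeddings is not None else (None, None)
        return hidden_states, cos, sin

layer = LlamaAttentionSketch()
out, cos, sin = layer.forward([1.0], position_embeddings=([0.5], [0.25]))
```

A subclass written against an older transformers release would raise a `TypeError` when the model passes `position_embeddings`, which is why the pinned version is bumped in this commit.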

* feat: update transformers version

* style: fix incorrect style

---------

Co-authored-by: Alessandro Palla <[email protected]>
nagic0 and alessandropalla authored Jul 25, 2024
1 parent f3b112a commit 2c8997b
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion requirements.txt
@@ -1,4 +1,4 @@
 numpy
 torch
-transformers>=4.39.3
+transformers>=4.43.0
 neural-compressor
