Commit
Add the position_embeddings param to LlamaAttention.forward (#105)
* fix: add the position_embeddings arg to LlamaAttention; it is a new argument introduced upstream here: https://github.com/huggingface/transformers/blame/main/src/transformers/models/llama/modeling_llama.py#L316
* feat: update transformers version
* style: fix incorrect style

---------

Co-authored-by: Alessandro Palla <[email protected]>
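For context, the upstream transformers refactor linked above has `LlamaAttention.forward` receive a precomputed `(cos, sin)` pair via a `position_embeddings` argument, instead of each layer deriving rotary embeddings from `position_ids` on its own. The sketch below is illustrative only, not this repository's actual patch; the class and helper names (`PatchedLlamaAttention`, `apply_rope`) are hypothetical and the module is deliberately minimal.

```python
# Illustrative sketch: shows the shape of the signature change, not the real patch.
from typing import Optional, Tuple

import torch
import torch.nn as nn


def apply_rope(q: torch.Tensor, k: torch.Tensor,
               cos: torch.Tensor, sin: torch.Tensor) -> Tuple[torch.Tensor, torch.Tensor]:
    """Minimal rotary-embedding application (half-rotation variant)."""
    def rotate_half(x: torch.Tensor) -> torch.Tensor:
        x1, x2 = x.chunk(2, dim=-1)
        return torch.cat((-x2, x1), dim=-1)

    cos = cos.unsqueeze(1)  # broadcast over the head dimension
    sin = sin.unsqueeze(1)
    return q * cos + rotate_half(q) * sin, k * cos + rotate_half(k) * sin


class PatchedLlamaAttention(nn.Module):
    """Attention forward that accepts the new `position_embeddings` argument:
    a precomputed (cos, sin) tuple produced once per model forward pass."""

    def __init__(self, hidden_size: int = 64, num_heads: int = 4):
        super().__init__()
        self.num_heads = num_heads
        self.head_dim = hidden_size // num_heads
        self.q_proj = nn.Linear(hidden_size, hidden_size, bias=False)
        self.k_proj = nn.Linear(hidden_size, hidden_size, bias=False)
        self.v_proj = nn.Linear(hidden_size, hidden_size, bias=False)
        self.o_proj = nn.Linear(hidden_size, hidden_size, bias=False)

    def forward(
        self,
        hidden_states: torch.Tensor,
        position_ids: Optional[torch.LongTensor] = None,
        position_embeddings: Optional[Tuple[torch.Tensor, torch.Tensor]] = None,  # new arg
    ) -> torch.Tensor:
        bsz, seq_len, _ = hidden_states.shape
        shape = (bsz, seq_len, self.num_heads, self.head_dim)
        q = self.q_proj(hidden_states).view(shape).transpose(1, 2)
        k = self.k_proj(hidden_states).view(shape).transpose(1, 2)
        v = self.v_proj(hidden_states).view(shape).transpose(1, 2)

        if position_embeddings is not None:
            cos, sin = position_embeddings  # (cos, sin) precomputed by the caller
            q, k = apply_rope(q, k, cos, sin)
        # else: the older code path would compute cos/sin from position_ids here

        attn = torch.nn.functional.scaled_dot_product_attention(q, k, v)
        attn = attn.transpose(1, 2).reshape(bsz, seq_len, -1)
        return self.o_proj(attn)
```

In this pattern the caller computes `(cos, sin)` once per forward pass and passes the tuple down to every layer, which is what makes the extra keyword argument necessary for any code that wraps or overrides `LlamaAttention.forward`.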