From 2c8997bc99ddfc65022c215952679620c788564a Mon Sep 17 00:00:00 2001
From: Nagico2
Date: Thu, 25 Jul 2024 17:55:24 +0800
Subject: [PATCH] Add the position_embeddings param to LlamaAttention.forward
 (#105)

* fix: add position_embeddings arg to LlamaAttention

  a new arg here: https://github.com/huggingface/transformers/blame/main/src/transformers/models/llama/modeling_llama.py#L316

* feat: update transformers version

* style: fix incorrect style

---------

Co-authored-by: Alessandro Palla
---
 requirements.txt | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/requirements.txt b/requirements.txt
index d22de26..14609e2 100644
--- a/requirements.txt
+++ b/requirements.txt
@@ -1,4 +1,4 @@
 numpy
 torch
-transformers>=4.39.3
+transformers>=4.43.0
 neural-compressor
\ No newline at end of file
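
For context: starting with transformers 4.43, the rotary (cos, sin) tables are computed once at the model level and passed down to every attention layer through a new position_embeddings argument on LlamaAttention.forward, which is why the pin is bumped from >=4.39.3 to >=4.43.0. The sketch below shows what accepting and forwarding the new argument looks like; the class name PatchedLlamaAttention is hypothetical and not part of this patch, it only illustrates the signature change against transformers>=4.43.

    from typing import Optional, Tuple

    import torch
    from transformers.models.llama.modeling_llama import LlamaAttention


    class PatchedLlamaAttention(LlamaAttention):
        # Hypothetical override, shown only to illustrate the 4.43 signature:
        # any forward() override must now accept `position_embeddings` and
        # pass it through, otherwise the precomputed rotary tables are lost.
        def forward(
            self,
            hidden_states: torch.Tensor,
            attention_mask: Optional[torch.Tensor] = None,
            position_ids: Optional[torch.LongTensor] = None,
            past_key_value=None,
            output_attentions: bool = False,
            use_cache: bool = False,
            cache_position: Optional[torch.LongTensor] = None,
            position_embeddings: Optional[Tuple[torch.Tensor, torch.Tensor]] = None,
            **kwargs,
        ):
            # Any custom pre-/post-processing would go here; the key point is
            # that `position_embeddings` is forwarded instead of being dropped.
            return super().forward(
                hidden_states=hidden_states,
                attention_mask=attention_mask,
                position_ids=position_ids,
                past_key_value=past_key_value,
                output_attentions=output_attentions,
                use_cache=use_cache,
                cache_position=cache_position,
                position_embeddings=position_embeddings,  # new argument in 4.43
                **kwargs,
            )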