From bef81089675ba0979ed005484397da57946807e2 Mon Sep 17 00:00:00 2001
From: Zizheng Yang <32321397+ZizhengYang@users.noreply.github.com>
Date: Fri, 12 Apr 2024 13:45:48 +0800
Subject: [PATCH] Update README.md

If we pin flash-attn==2.3.6, the install fails at import time with:

ImportError: /root/miniconda3/envs/handbook/lib/python3.10/site-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN2at4_ops9_pad_enum4callERKNS_6TensorEN3c108ArrayRefINS5_6SymIntEEElNS5_8optionalIdEE

If we use the newest flash_attn version instead, there is no trouble.
---
 README.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/README.md b/README.md
index 169c3ca8..7db5bf53 100644
--- a/README.md
+++ b/README.md
@@ -76,11 +76,11 @@ python -m pip install .
 
 You will also need Flash Attention 2 installed, which can be done by running:
 
 ```shell
-python -m pip install flash-attn==2.3.6 --no-build-isolation
+python -m pip install flash-attn --no-build-isolation
 ```
 
 > **Note**
-> If your machine has less than 96GB of RAM and many CPU cores, reduce the `MAX_JOBS` arguments, e.g. `MAX_JOBS=4 pip install flash-attn==2.3.6 --no-build-isolation`
+> If your machine has less than 96GB of RAM and many CPU cores, reduce the `MAX_JOBS` arguments, e.g. `MAX_JOBS=4 pip install flash-attn --no-build-isolation`
 
 Next, log into your Hugging Face account as follows:
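The undefined-symbol error above is the typical sign of a flash-attn wheel compiled against a different torch ABI than the one installed. A minimal sketch of how one might reinstall and smoke-test after applying this patch (the conda env path in the error is illustrative; any environment works):

```shell
# Rebuild/reinstall flash-attn against the currently installed torch.
# --no-build-isolation makes pip compile using the torch already in the env
# rather than a fresh isolated copy, avoiding the ABI mismatch.
python -m pip install --upgrade --force-reinstall flash-attn --no-build-isolation

# Smoke test: an ABI mismatch surfaces immediately at import time,
# so a clean import confirms the fix.
python -c "import flash_attn; print(flash_attn.__version__)"
```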