Getting error while loading meta-llama/Llama-2-7b-chat-hf
#3627
Hello All, I have fine-tuned meta-llama/Llama-2-7b-chat-hf and I am getting an error while loading it. The code used for loading the model:
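The original snippet did not survive in the thread; as a hypothetical sketch (assuming the standard `transformers` Auto classes, which is the usual way to load this model), the loading code likely resembled:

```python
# Hypothetical reconstruction of the loading code; the model id is taken
# from the thread title, everything else is an assumption.
MODEL_ID = "meta-llama/Llama-2-7b-chat-hf"

def load_model(model_id: str = MODEL_ID):
    """Load tokenizer and model; a gated repo needs HUGGING_FACE_HUB_TOKEN set."""
    # Import inside the function so the sketch can be shown without
    # transformers installed; from_pretrained downloads/loads the weights.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)
    return tokenizer, model
```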
Replies: 1 comment 2 replies
Hi @Shrijeeth, are you able to check whether your HUGGING_FACE_HUB_TOKEN environment variable is set? This error usually happens when the token isn't set in the environment, and it is required for access to the gated Llama-2 suite of models from Meta. If it isn't set, you can either export it in your shell/environment or set it from Python, and then try to load your model again. Let me know if this fixes the issue!
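The code blocks from the original reply were lost in extraction; as a sketch of the two options (the `hf_xxx` value is a placeholder, not a real token), they look like:

```python
# Option 1 -- in your shell, before starting Python:
#   export HUGGING_FACE_HUB_TOKEN=hf_xxx
#
# Option 2 -- from Python, before loading the model:
import os

# Placeholder token; replace with your real Hugging Face access token.
os.environ["HUGGING_FACE_HUB_TOKEN"] = "hf_xxx"

def token_is_set() -> bool:
    """True when the Hugging Face token variable is present and non-empty."""
    return bool(os.environ.get("HUGGING_FACE_HUB_TOKEN"))
```

Another common route (not necessarily what the reply showed) is `huggingface_hub.login(token=...)` or the `huggingface-cli login` command, both of which persist the token for later library calls.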