
No models work except the standard one - bad magic #205

Open
NeiroNext opened this issue Apr 5, 2023 · 6 comments

Comments

@NeiroNext

It does not work with any model other than the one attached in the description. I downloaded four 13B models and got a "bad magic" error each time, even though each model's description says its format is suitable for Alpaca. I downloaded a 30B model: bad magic again. Before finding the working 7B model, I also had to download five non-working ones.

Is this a bug, and will it be fixed in the future? Why do the model descriptions say they are suitable for Alpaca when in reality they do not work? Is this a bug in the new version? Please share a working model, at least 13B. I have already given up; nothing works except the standard one.

@srevill

srevill commented Apr 10, 2023

I'm having the same issue, unfortunately, and I've made no progress either. I'm trying to use ggml-vicuna-7b-4bit.

@demian85

Where can I download those models? Why is nothing specified in the docs?

@srevill

srevill commented Apr 10, 2023

Where can I download those models? Why is nothing specified in the docs?

https://medium.com/@martin-thissen/vicuna-on-your-cpu-gpu-best-free-chatbot-according-to-gpt-4-c24b322a193a
I ended up figuring it out by following this guide; you have to run the Python code it mentions to get the models downloaded correctly.

@ASX320

ASX320 commented Apr 13, 2023

I've got the same issue. 7B works via chat_mac.sh, but it can't see any models other than 7B. I even tried renaming the 13B model the same way as the 7B one, but got "Bad magic". Otherwise it searches for the 7B model and says:

"llama_model_load: loading model from 'ggml-alpaca-7b-q4.bin' - please wait ...
llama_model_load: failed to open 'ggml-alpaca-7b-q4.bin'
main: failed to load model from 'ggml-alpaca-7b-q4.bin'".

I also tried ./chat_mac -m alpaca-13b-ggml-q4_0-lora-merged/ggml-model-q4_0.bin but got the same "Bad magic" error.

@NeiroNext
Author

I've got the same issue. 7B works via chat_mac.sh, but it can't see any models other than 7B. I even tried renaming the 13B model the same way as the 7B one, but got "Bad magic". Otherwise it searches for the 7B model and says

I also tried ./chat_mac -m alpaca-13b-ggml-q4_0-lora-merged/ggml-model-q4_0.bin but got the same "Bad magic" error.

I found this "bad magic" check in the source code and removed it, but that did not fix the situation; unfortunately, a model in the specific expected format is needed.

@ASX320

ASX320 commented Apr 14, 2023

It does not work with any model other than the one attached in the description. I downloaded four 13B models and got a "bad magic" error each time, even though each model's description says its format is suitable for Alpaca. I downloaded a 30B model: bad magic again. Before finding the working 7B model, I also had to download five non-working ones.

Is this a bug, and will it be fixed in the future? Why do the model descriptions say they are suitable for Alpaca when in reality they do not work? Is this a bug in the new version? Please share a working model, at least 13B. I have already given up; nothing works except the standard one.

I found this file: af9ab4a
It contains links to download the 7B, 13B, and 30B models. That helped me.
