Question about the perplexity #225

Open
eljrte opened this issue Oct 13, 2024 · 0 comments
Labels
question Further information is requested

Comments

eljrte commented Oct 13, 2024

I am reproducing the perplexity results, and I am confused by the following comment lines in examples/perplexity/perplexity.cpp:
// Download: https://s3.amazonaws.com/research.metamind.io/wikitext/wikitext-2-raw-v1.zip?ref=salesforce-research
// Run ./perplexity -m models/7B/ggml-model-q4_0.bin -f wiki.test.raw
// Output: perplexity: 13.5106 [114/114]

First, I can't open the website mentioned above, so I downloaded wikitext-2-raw-v1 from Hugging Face.
Second, I couldn't find "models/7B/ggml-model-q4_0.bin", so I used ReluLLaMA-7B-PowerInfer-GGUF/llama-7b-relu.powerinfer.gguf instead, which I also downloaded from the SparseLLM organization on Hugging Face.
Third, I didn't get a perplexity of 13.5106. Instead I got "Final estimate: PPL = 1241.7730 +/- 8.93354", which makes no sense at all.
I would really appreciate your reply and any tips. Thank you again for the extraordinary work.
By the way, I also want to reproduce the accuracy results on the four SuperGLUE datasets, but I have no idea how to make use of them. Could you provide some scripts or a demo?
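For context, my understanding is that perplexity is the exponential of the mean per-token negative log-likelihood, so a PPL of 1241 would mean the model assigns, on average, roughly 1/1241 probability to each test token, which is why I suspect a model or tokenizer mismatch rather than a measurement issue. A minimal sketch of the metric itself (my own illustration, not the actual perplexity.cpp code; the function name is made up):

```python
import math

def perplexity(neg_log_likelihoods):
    # Perplexity = exp of the mean negative log-likelihood (in nats) per token.
    return math.exp(sum(neg_log_likelihoods) / len(neg_log_likelihoods))

# Toy check: a model that assigns probability 0.25 to every token has
# NLL = ln(4) per token, so its perplexity should be exactly 4.
nlls = [math.log(4.0)] * 10
print(perplexity(nlls))
```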

@eljrte eljrte added the question Further information is requested label Oct 13, 2024