[Bug]: Fix context length bug in local LLMs proposal #825
Labels: bug (Something isn't working)
Comments
I also added this here; it works like a charm for avoiding gibberish, although the output may sometimes be less accurate than GPT-4 Turbo.
Seems solved, therefore closing.
No, it's not solved; the code wasn't fixed.
Version
Visual Studio Code extension
Operating System
Windows 10
What happened?
I found a potential fix for the context length bug where the LLM keeps outputting gibberish. It still produces some gibberish, but it no longer gets stuck on it. Also, increase alpha_value to 3 and experiment with different n_batch values, such as 1024 instead of 512 (this increases the usable input context length); see the sketch below.