[Bug]: Prompt token count of 5677 exceeds max token count of 4095. #2404
Unanswered · SamWylde asked this question in Troubleshooting · Replies: 3 comments, 1 reply
-
I'm not sure why the "Plugins" endpoint is giving you this issue while the "OpenAI" endpoint works fine. I will look into it, but switching the endpoint is a "quick fix".
-
I am using the Groq API with Llama 3 70B, which has a max token limit of 9000, but LibreChat gives me this error.
-
What happened?
Using any of the GPT-4 models, an error occurs whenever the prompt exceeds 4,095 tokens. GPT-4 models should support up to 128,000 tokens of context.
"Something went wrong. Here's the specific error message we encountered: Prompt token count of 5677 exceeds max token count of 4095."
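The error message suggests the client is validating the prompt against a configured 4,095-token budget rather than the model's actual context window. The sketch below is a hypothetical illustration of that kind of pre-flight check, not LibreChat's actual code: the function names and the rough 4-characters-per-token heuristic are assumptions made for the example.

```python
# Hypothetical sketch of a client-side prompt budget check.
# The names and the chars/4 token estimate are illustrative
# assumptions, not LibreChat's real implementation.

def approx_token_count(text: str) -> int:
    """Rough estimate: ~4 characters per token for English text."""
    return max(1, len(text) // 4)

def check_prompt_budget(prompt: str, max_context_tokens: int) -> None:
    """Raise an error in the same style the issue reports when the
    estimated prompt size exceeds the configured context window."""
    count = approx_token_count(prompt)
    if count > max_context_tokens:
        raise ValueError(
            f"Prompt token count of {count} exceeds "
            f"max token count of {max_context_tokens}."
        )

# If the client assumes the legacy 4,095-token window, a long prompt
# is rejected even though the model itself supports a larger context:
long_prompt = "word " * 8000  # ~10,000 estimated tokens
try:
    check_prompt_budget(long_prompt, max_context_tokens=4095)
except ValueError as err:
    print(err)

# Raising the configured limit to the model's real window succeeds:
check_prompt_budget(long_prompt, max_context_tokens=128000)
```

If this is what is happening, the fix would be for the configured max context size to track the selected model's real window instead of a fixed default.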
Steps to Reproduce
What browsers are you seeing the problem on?
Firefox
Relevant log output
No response