token length issue #41
Comments
The token limits include both the prompt and the response together; there is no way to make gpt-4 process more than 8192 tokens (gpt-4-32k can do 32768). Theoretically, we could trim the beginning of the first message, but that might not be desirable either, because the context at the beginning would be lost.
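Since the limit covers prompt and completion together, a client has to budget tokens before sending the request. A minimal sketch of the trimming idea described above, assuming a simple drop-oldest-first policy; the character-based counter is a rough approximation for illustration only (a real client would use an actual tokenizer such as tiktoken):

```python
def approx_tokens(message):
    """Very rough token estimate (~4 characters per token for English).
    Placeholder for a real tokenizer; accuracy is not guaranteed."""
    return max(1, len(message["content"]) // 4)


def trim_messages(messages, max_prompt_tokens, count_tokens=approx_tokens):
    """Drop the oldest messages until the conversation fits the budget.

    max_prompt_tokens should be the model's context limit minus the
    number of tokens reserved for the response, since both count
    against the same limit.
    """
    trimmed = list(messages)
    while trimmed and sum(count_tokens(m) for m in trimmed) > max_prompt_tokens:
        trimmed.pop(0)  # lose the earliest context first
    return trimmed
```

For example, with an 8192-token model and 1000 tokens reserved for the reply, the prompt budget would be `8192 - 1000`, and anything older than that budget allows gets dropped. The trade-off the comment above points out still applies: whatever is trimmed from the start of the conversation is simply gone.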
OK, so is this also the case for ChatGPT? I noticed that I ran into the token issue on chatbotui.com, for example, but I feel like I never really hit this limitation on chat.openai.com. Are they just really good at not making it clear to the user?
I use gpt-4, and sometimes when my message is pretty long I run into the error below. If I then adjust my input so that it is within the 8192-token limit, the prompt goes through, but it then fails partway through producing a long response and resurfaces that error. This has happened to me twice now.