Fix prompt length calculation #548

Open · wants to merge 1 commit into base: main

Conversation

@alxmiron (Contributor)

The current _buildMessages() calculates the prompt token count incorrectly: there is a small discrepancy with the usage amount that ChatGPT reports (in non-stream mode).
I've fixed it by adopting the logic from here.
Our estimated numTokens should now equal message.detail.usage.prompt_tokens.
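
For reference, here is a minimal Python sketch of the counting scheme this comment refers to, assuming the linked logic is OpenAI's cookbook recipe for counting chat tokens with tiktoken. The function name and the constants below come from that recipe, not from this repository, and the constants have shifted across cookbook revisions:

```python
import tiktoken

def num_tokens_from_messages(messages, model="gpt-3.5-turbo-0301"):
    """Estimate prompt tokens the way the ChatML wire format charges them."""
    encoding = tiktoken.encoding_for_model(model)
    tokens_per_message = 4  # each message wraps as <|start|>{role}\n{content}<|end|>\n
    tokens_per_name = -1    # the role token is dropped when a name field is present
    num_tokens = 0
    for message in messages:
        num_tokens += tokens_per_message
        for key, value in message.items():
            num_tokens += len(encoding.encode(value))
            if key == "name":
                num_tokens += tokens_per_name
    num_tokens += 3  # every reply is primed with <|start|>assistant<|message|>
    return num_tokens
```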

@zhujunsan

It would be better to check the model first and then calculate; gpt-3.5 and gpt-4 seem to count tokens differently.
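
A sketch of what that model check could look like, using the per-model overhead constants from the cookbook recipe (the values are the cookbook's, assumed here rather than verified against this repo):

```python
def message_overhead(model):
    """Return (tokens_per_message, tokens_per_name) for a given model.
    Constants follow the OpenAI cookbook and may change with model revisions."""
    if model == "gpt-3.5-turbo-0301":
        return 4, -1
    if model.startswith("gpt-4"):
        return 3, 1
    # Unknown model: fall back to the gpt-4 values and re-validate against usage data.
    return 3, 1
```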

@zhujunsan

I also made this change in #546, but your code looks better 😀

@zhujunsan

After testing, at least with gpt-3.5, tokens_per_message is 5. I think there's a missing \n that isn't being counted.
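
One way to check that constant empirically (a sketch assuming the pre-1.0 openai Python SDK and the num_tokens_from_messages sketch above): send a request in non-stream mode and compare the local estimate against the usage field the API returns:

```python
import openai  # pre-1.0 SDK assumed; set openai.api_key first

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
]
resp = openai.ChatCompletion.create(model="gpt-3.5-turbo", messages=messages)
actual = resp["usage"]["prompt_tokens"]
estimated = num_tokens_from_messages(messages, model="gpt-3.5-turbo-0301")
# A constant gap of len(messages) would suggest tokens_per_message is off by one.
print(f"actual={actual} estimated={estimated} diff={actual - estimated}")
```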

@zhujunsan

Is there anyone who can review or merge this PR?
