Return usage when streaming chat completions. #1053
Comments
Thanks for the suggestion! This sounds like a feature request for the underlying OpenAI API and not the Python library, so I'm going to go ahead and close this issue. Would you mind reposting at community.openai.com?
There is a community post for this here: https://community.openai.com/t/openai-api-get-usage-tokens-in-response-when-set-stream-true/141866

@rattrayalex Would you consider reopening this issue so we can subscribe to be notified when streaming usage has been added to the Python client? Thanks. This is a requested feature for the package I maintain: jackmpcollins/magentic#74
Sorry, I can't guarantee that we'd remember to close this issue when the feature ships, so I'd rather not reopen it. If we do remember, however, we'll leave a comment here so you'll be notified.
For anyone looking in the future, this is now shipped: https://x.com/openaidevs/status/1787573348496773423?s=46&t=jm46NyFc_ht8JJU4ZFVivw
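For reference, the shipped feature adds a `stream_options={"include_usage": True}` parameter to `client.chat.completions.create(..., stream=True)`: intermediate chunks carry `usage=None`, and one final chunk with an empty `choices` list carries the populated usage. Below is a minimal sketch of the accumulation loop a client would run; the dataclasses are simplified stand-ins for the real `ChatCompletionChunk` objects, and the simulated stream exists only so the example is self-contained.

```python
# Sketch of consuming a chat-completions stream that includes usage in its
# final chunk (as enabled by stream_options={"include_usage": True}).
# The types below are simplified stand-ins, not the openai library's own.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Usage:
    prompt_tokens: int
    completion_tokens: int
    total_tokens: int


@dataclass
class Delta:
    content: Optional[str] = None


@dataclass
class Choice:
    delta: Delta = field(default_factory=Delta)


@dataclass
class Chunk:
    choices: List[Choice]
    usage: Optional[Usage] = None  # populated only on the final chunk


def collect(stream):
    """Join streamed delta content and capture usage from the final chunk."""
    text_parts = []
    usage = None
    for chunk in stream:
        for choice in chunk.choices:
            if choice.delta.content:
                text_parts.append(choice.delta.content)
        if chunk.usage is not None:
            # The usage-bearing chunk has an empty choices list.
            usage = chunk.usage
    return "".join(text_parts), usage


# Simulated stream mirroring the shape of the real API's chunks.
stream = [
    Chunk(choices=[Choice(Delta("Hello"))]),
    Chunk(choices=[Choice(Delta(" world"))]),
    Chunk(choices=[], usage=Usage(prompt_tokens=5, completion_tokens=2, total_tokens=7)),
]
text, usage = collect(stream)
print(text, usage.total_tokens)
```

With the real client, the same loop applies to the iterator returned by `client.chat.completions.create(model=..., messages=..., stream=True, stream_options={"include_usage": True})`, checking `chunk.usage` on each chunk.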
Confirm this is a feature request for the Python library and not the underlying OpenAI API.
Describe the feature or improvement you're requesting
Is it possible to return token usage in the final chunk when streaming? Otherwise we also need to use tiktoken in an application where streaming may be either enabled or disabled.
Additional context
No response