Use experimental_StreamData to send OpenAI finish_reason to client (response to issue #616) #663
Replies: 4 comments 3 replies
-
@peterdresslar thanks for posting this. I tried it and the result includes more data, but it doesn't match what the OpenAI SDK returns when I call it directly.
-
@gtokman, just to clarify, are you saying that the data are different if you procure the stream by fetching the API with the OpenAI node SDK instead of the bare fetch in the example? That would be somewhat interesting, though straightforward to adjust to, I think?
-
When I use the SDK directly, the response looks like this:
-
Why is it so hard for the streaming response to be identical to what OpenAI gives you if you use the SDK directly? I am mainly looking for a solution like this so that I don't hard-code API keys in my client and can use Vercel as a proxy server.
-
Hello, I have been thinking about the discussion on issue #616 about sending metadata from the OpenAI API Chat Completion Chunk object to the client. It turns out this is possible, but a little clunky, since it requires reading the stream for two different purposes at the same time. I am confident that the `AIStream` helpers will eventually expose richer data, but in the meantime this hack (and it is a hack!) does work.

We take advantage of the ability to `tee` a response body into two streams. The happy news is that the signature for `OpenAIStream` accepts a raw `Response`, so we just have to convert the type and everything runs smoothly with the tee'd stream. The other branch of the stream is a little rockier: in this implementation we rely on the fact that the last chunk from OpenAI always(?) carries the `finish_reason`, while all the others have a null `finish_reason`. There are almost certainly more elegant ways to do this, but hopefully you get the idea.

This example just fetches against the OpenAI API rather than using the TypeScript library, but I think it would be straightforward to switch that. A sketch of the route handler is below.
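The sketch below is a reconstruction under assumptions rather than the exact code from the example: it assumes a Next.js App Router edge route, the `ai` package's `experimental_StreamData` / `experimental_streamData` APIs as they existed at the time, and an illustrative model name and `{ finish_reason }` payload shape; the SSE line parsing is my own.

```ts
// Minimal sketch: tee the raw OpenAI response so one branch feeds OpenAIStream
// and the other is scanned for the chunk carrying a non-null finish_reason.
import { OpenAIStream, StreamingTextResponse, experimental_StreamData } from 'ai';

export const runtime = 'edge';

export async function POST(req: Request) {
  const { messages } = await req.json();

  // Bare fetch against the OpenAI API (no SDK), with streaming enabled.
  const upstream = await fetch('https://api.openai.com/v1/chat/completions', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({ model: 'gpt-3.5-turbo', stream: true, messages }),
  });

  // One branch for the text stream, one for metadata.
  const [textBranch, metaBranch] = upstream.body!.tee();

  const data = new experimental_StreamData();

  // Read the metadata branch in the background and append the finish_reason
  // from the final chunk (the only one where it is non-null).
  (async () => {
    const reader = metaBranch.pipeThrough(new TextDecoderStream()).getReader();
    let buffer = '';
    while (true) {
      const { done, value } = await reader.read();
      if (done) break;
      buffer += value;
      const lines = buffer.split('\n');
      buffer = lines.pop() ?? ''; // keep any trailing partial line for the next read
      for (const line of lines) {
        if (!line.startsWith('data: ') || line.includes('[DONE]')) continue;
        const chunk = JSON.parse(line.slice('data: '.length));
        const finishReason = chunk.choices?.[0]?.finish_reason;
        if (finishReason) data.append({ finish_reason: finishReason });
      }
    }
    data.close(); // the response will not complete until the data stream is closed
  })();

  // OpenAIStream accepts a raw Response, so wrap the other teed branch.
  const stream = OpenAIStream(
    new Response(textBranch, { status: upstream.status, headers: upstream.headers }),
    { experimental_streamData: true }
  );

  return new StreamingTextResponse(stream, {}, data);
}
```

The two things to get right are the `tee()` on the response body and closing the `StreamData` once the metadata branch is exhausted; otherwise the streaming response hangs waiting for the data stream to finish.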
This will always send a message into your `data` array on the client, for you to process however you like. For an example client consumer, you can check out the code here. (The example route is `openai-experimental-streamdata`.) (I edited out many log statements.)
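For context, a client consumer along these lines might look like the following sketch. It assumes the `useChat` hook from `ai/react`, where values appended via `experimental_StreamData` surface in the `data` array; the route path and rendering are illustrative, not the code from the linked example.

```tsx
'use client';

// Minimal sketch of a client that reads the appended data alongside the messages.
import { useChat } from 'ai/react';

export default function Chat() {
  const { messages, data, input, handleInputChange, handleSubmit } = useChat({
    api: '/api/openai-experimental-streamdata', // assumed route path
  });

  return (
    <div>
      {messages.map((m) => (
        <p key={m.id}>
          {m.role}: {m.content}
        </p>
      ))}
      {/* data holds the JSON values appended on the server,
          e.g. [{ finish_reason: 'stop' }] once the stream completes. */}
      <pre>{JSON.stringify(data, null, 2)}</pre>
      <form onSubmit={handleSubmit}>
        <input value={input} onChange={handleInputChange} />
      </form>
    </div>
  );
}
```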