feat: better LLM response format #387
Conversation
kyriediculous commented on Dec 28, 2024:
- Improve LLM responses (OpenAI-compatible format for non-streaming responses)
- Use UUIDs for vLLM request generation
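The two changes above can be illustrated together: a non-streaming response that follows the public OpenAI Chat Completions payload shape, with a UUID-based request id. This is a minimal sketch of the target format, not the PR's actual implementation; the function name and model string are hypothetical.

```python
import time
import uuid


def make_chat_completion_response(model: str, text: str) -> dict:
    """Build an OpenAI-compatible non-streaming chat completion payload.

    Field names follow the public OpenAI Chat Completions schema;
    the id is generated from a UUID, as the PR does for vLLM requests.
    """
    return {
        "id": f"chatcmpl-{uuid.uuid4()}",  # UUID-based request id
        "object": "chat.completion",
        "created": int(time.time()),
        "model": model,
        "choices": [
            {
                "index": 0,
                "message": {"role": "assistant", "content": text},
                "finish_reason": "stop",
            }
        ],
    }


resp = make_chat_completion_response("example-model", "Hello!")
```

Clients written against the OpenAI SDK can then parse `choices[0].message.content` without any Livepeer-specific handling.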
Force-pushed: 5e5da0e to d780528
I don't know exactly why, but I keep running into differently generated bindings than you do. It might be a versioning issue.
Reviewed the updates; some comments below:
@kyriediculous, can you comment on the items above?
ad-astra-video left a comment:
A couple of additional updates are needed for the change in response format.
victorges left a comment:
Code LGTM, but I haven't checked the OpenAPI schema to confirm the changes remain compatible; I only reviewed the changes here.
Force-pushed: 97af593 to 5d10a62
Approved! Will fix the OpenAPI generation in a separate PR.