
add flag to disable EventSourceResponse ping messages #1256

Closed · khimaros opened this issue Mar 5, 2024 · 0 comments · Fixed by #1257

khimaros (Contributor) commented Mar 5, 2024

Is your feature request related to a problem? Please describe.
Some OpenAI-based software, such as ztjhz/BetterChatGPT#538, does not gracefully handle ping messages, which prevents it from working reliably with llama-cpp-python[server].

Describe the solution you'd like
Add a flag to llama-cpp-python[server] that disables ping messages, for use with apps that do not understand `:`-prefixed SSE comments (a rough sketch follows below).

Describe alternatives you've considered
Ping messages could be disabled unconditionally in llama-cpp-python[server], but that may break other use cases.
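
A minimal sketch of what such a flag could look like, assuming the server streams via sse_starlette's `EventSourceResponse` (which llama-cpp-python[server] uses) and its `ping_message_factory` hook. The `DISABLE_PING_EVENTS` name and the empty-bytes factory below are illustrative assumptions, not necessarily what #1257 ended up implementing:

```python
import asyncio

from fastapi import FastAPI
from sse_starlette.sse import EventSourceResponse

app = FastAPI()

# Hypothetical setting; in a real server this would come from the server's
# settings/config object rather than a module-level constant.
DISABLE_PING_EVENTS = True


@app.get("/v1/stream")
async def stream():
    async def event_generator():
        # Each yielded dict becomes one "data: ..." SSE event.
        for i in range(3):
            yield {"data": f"chunk {i}"}
            await asyncio.sleep(1)

    # By default sse_starlette periodically emits ": ping ..." comment lines
    # to keep the connection alive. Clients that don't handle ":"-prefixed
    # SSE comments choke on these. Supplying a factory that returns empty
    # bytes suppresses the ping payload; leaving it as None keeps the
    # default (backward-compatible) behavior.
    ping_message_factory = (lambda: bytes()) if DISABLE_PING_EVENTS else None
    return EventSourceResponse(
        event_generator(),
        ping_message_factory=ping_message_factory,
    )
```

Keeping the flag off by default preserves the existing keep-alive behavior for clients that rely on it, which matches the backward-compatibility note in the commit message below.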

abetlen pushed a commit that referenced this issue Apr 17, 2024
for backward compatibility, this is false by default

it can be set to true to disable EventSource pings
which are not supported by some OpenAI clients.

fixes #1256
xhedit pushed a commit to xhedit/llama-cpp-conv that referenced this issue Apr 30, 2024