Adding Logprobs in Chat completion response #60

Open

soetedja opened this issue Aug 5, 2024 · 4 comments

Comments

soetedja commented Aug 5, 2024

Hi,

Will logprobs be added to the LM Studio chat completion response?

Currently, when using the OpenAI .NET library, `Logprobs` is a required property on `CreateChatCompletionResponseChoices`, so its absence from the response results in an error:

    [global::System.Text.Json.Serialization.JsonPropertyName("logprobs")]
    [global::System.Text.Json.Serialization.JsonRequired]
    public required global::OpenAI.CreateChatCompletionResponseChoicesLogprobs? Logprobs { get; set; }
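
For reference, a minimal sketch in Python (using `requests`) that inspects the raw choice object returned by LM Studio's OpenAI-compatible endpoint; the base URL `http://localhost:1234/v1` and the model name are assumptions here. A client that marks `logprobs` as required, like the generated .NET type above, will fail to deserialize the response if the key is missing:

    import json

    import requests

    # Assumptions: LM Studio's local server is running on its default port (1234)
    # and "your-local-model" is a placeholder for whatever model is loaded.
    BASE_URL = "http://localhost:1234/v1"

    payload = {
        "model": "your-local-model",
        "messages": [{"role": "user", "content": "Hello"}],
    }

    resp = requests.post(f"{BASE_URL}/chat/completions", json=payload, timeout=60)
    resp.raise_for_status()

    choice = resp.json()["choices"][0]
    # A strict deserializer fails if this key is absent rather than null.
    print("logprobs present:", "logprobs" in choice)
    print(json.dumps(choice, indent=2))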

Thank you.

@stygmate

+1, as it would make it possible to evaluate a confidence score.

@CarloNicolini

Logprobs are still not available in the `response.choices[0]` output with LM Studio 0.3.5, even though no error is thrown when `logprobs=True` is passed.
Any idea whether this will be part of an upcoming release?
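
A minimal repro sketch with the `openai` Python client pointed at LM Studio's local server (the base URL, placeholder API key, and model name are assumptions):

    from openai import OpenAI

    # Assumptions: LM Studio's OpenAI-compatible server on its default port;
    # the API key is ignored locally and the model name is a placeholder.
    client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

    response = client.chat.completions.create(
        model="your-local-model",
        messages=[{"role": "user", "content": "Hello"}],
        logprobs=True,  # accepted without error as of LM Studio 0.3.5
    )

    # ...but the logprobs field in the choice still comes back empty (None).
    print(response.choices[0].logprobs)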

@stygmate

+232 for this feature 😅

Nayjest commented Mar 19, 2025

+1
