Our AI provider is internal to the company, and it provides us with an API. This API is quite different from the standard OpenAI API: it requires extra headers, and its response format is also different.
I noticed that the project uses litellm at its core to call large models.
I would like to know whether litellm supports this kind of custom API call, or whether we need to deploy our own proxy service to handle the protocol translation.
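For the custom-headers part of the question, litellm's `completion()` does accept an `extra_headers` argument (forwarded to the underlying HTTP request) and an `api_base` override for OpenAI-compatible endpoints. A minimal sketch of how the arguments could be assembled; the host, header name, token, and model name below are placeholders, not a real configuration:

```python
# Sketch: targeting an OpenAI-compatible internal endpoint via litellm.
# Hypothetical names throughout (host, header, token, model).

def build_litellm_kwargs(host: str, token: str, messages: list) -> dict:
    """Assemble the keyword arguments that litellm.completion() would receive.

    `api_base` overrides the default OpenAI host, and `extra_headers` adds
    custom per-request HTTP headers, so an OpenAI-compatible internal API
    can be called without a separate translation proxy.
    """
    return {
        "model": "openai/internal-model",  # "openai/" prefix: OpenAI-compatible route
        "api_base": f"{host}/v1",          # requests go to {host}/v1/chat/completions
        "api_key": token,                  # internal credential, if required
        "extra_headers": {"X-Internal-Auth": token},  # custom headers
        "messages": messages,
    }

kwargs = build_litellm_kwargs(
    "https://ai.example.internal",
    "secret-token",
    [{"role": "user", "content": "hello"}],
)
# Then the actual call would be: litellm.completion(**kwargs)
```

Note that this only covers the request side; if the response format also deviates from the OpenAI schema, litellm alone may not be enough and a thin proxy could still be needed for response translation.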
Feature request
Our company provides an API similar to OpenAI's (`{host}/v1/chat/completions`), but it requires custom headers in the API request.
Does pr-agent support this?
Motivation
Use a Self-Hosted Model