The ability to set a base_url would make this plugin usable with any OpenAI-compatible API.
For instance, if we set base_url to the address of a LiteLLM proxy, it would become possible to use this plugin with a multitude of different LLMs (including open-source local models served via Ollama).
Naturally, to prevent breaking changes, the default value for such a setting should be "https://api.openai.com/v1".
In my opinion, this setting should be incorporated into the open_ai settings, because, like api_key, it is a parameter of OpenAI's official Python API.
@S1M0N38 do you think a change like this would allow integration with something that runs locally like LM Studio? That application allows you to interact with the model via a port on localhost. Would love to try a totally offline and free AI coding experience. https://lmstudio.ai/