Using authenticated Ollama instance #1422
Comments
Please give an example. How do you currently call the model (code)? Can you use litellm to call it? https://docs.litellm.ai/docs/providers/ollama (I am hinting that if it is supported by litellm, it should be supported by us.)
Hi @mrT23, thanks for getting back to me. This would be the example using litellm.
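Roughly the sketch below (the host, model name, and key are placeholders; it assumes litellm forwards `extra_headers` to the Ollama backend, per https://docs.litellm.ai/docs/providers/ollama):

```python
import litellm

# Placeholders: substitute the real host, model, and API key.
response = litellm.completion(
    model="ollama/llama3",
    api_base="https://ollama.internal.example.com",
    messages=[{"role": "user", "content": "Say hello"}],
    extra_headers={"X-API-Key": "<secret>"},  # header required by the auth layer in front of Ollama
)
print(response.choices[0].message.content)
```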
If litellm supports it, pr-agent probably can. But I am not sure they do; try looking in their documentation.
@mrT23 litellm supports it, hence the code snippet I shared; it is a working snippet. The question is how I can configure pr-agent to make use of this litellm feature. Or, if it is not supported by pr-agent, a hint on where to look so I can contribute support for it.
Something like this:
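A minimal configuration sketch along the lines of the Ollama section of the docs linked below (the model name and api_base are placeholders; check that page for the exact keys):

```toml
[config]
model = "ollama/llama3"              # placeholder model name
fallback_models = ["ollama/llama3"]

[ollama]
api_base = "http://localhost:11434"  # placeholder; point this at your Ollama host
```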
But also review this: https://qodo-merge-docs.qodo.ai/usage-guide/changing_a_model/#ollama
We are using an authenticated Ollama server, where we pass an X-API-Key header with our requests. Is this supported by this repo? If not, I am happy to make a contribution if someone can point me in the right direction :)
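For context, the kind of request this involves looks roughly like the sketch below (host, model, and key are placeholders; it assumes Ollama's standard /api/generate endpoint sits behind the authenticating proxy):

```python
import requests

# Placeholders: substitute the real host, model, and API key.
resp = requests.post(
    "https://ollama.internal.example.com/api/generate",
    headers={"X-API-Key": "<secret>"},  # header enforced by the auth layer in front of Ollama
    json={"model": "llama3", "prompt": "Say hello", "stream": False},
    timeout=60,
)
print(resp.json()["response"])
```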