
Using authenticated Ollama instance #1422

Open
iamNoah1 opened this issue Dec 28, 2024 · 6 comments

iamNoah1 commented Dec 28, 2024

We are using an authenticated Ollama server, where we pass an X-API-key header with our requests. Is this supported by this repo? If not, I am happy to make a contribution if someone can point me in the right direction :)

mrT23 (Collaborator) commented Dec 29, 2024

Give an example. How do you call the model currently (code)? Can you use litellm to call it?

https://docs.litellm.ai/docs/providers/ollama

(I am hinting that if it is supported by litellm, it should be supported by us)

iamNoah1 (Author)

Hi @mrT23, thanks for getting back to me. This is the example using litellm:

```python
from litellm import completion

# Base URL of our authenticated Ollama instance
api_base = "redacted"

model_name = "ollama/llama3.1"
messages = [
    {"role": "user", "content": "What is the capital of France?"}
]

# Custom auth header our Ollama server expects on every request
headers = {
    "X-API-key": "redacted"
}

# litellm forwards `headers` to the underlying HTTP request
response = completion(
    model=model_name,
    messages=messages,
    api_base=api_base,
    headers=headers,
    max_tokens=50
)

print(response)
```
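
For clarity, this is roughly the raw request litellm issues under the hood (a sketch, assuming the default `ollama/` provider, which talks to Ollama's native `/api/generate` endpoint; the exact path may differ per deployment):

```python
# Sketch: the equivalent raw HTTP call, showing where the X-API-key header ends up.
import requests

api_base = "redacted"  # same base URL as above

raw = requests.post(
    f"{api_base}/api/generate",          # Ollama's native completion endpoint
    headers={"X-API-key": "redacted"},   # the same header litellm forwards
    json={
        "model": "llama3.1",
        "prompt": "What is the capital of France?",
        "stream": False,                 # one JSON response instead of a stream
    },
)
print(raw.json()["response"])
```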

mrT23 (Collaborator) commented Jan 1, 2025

If litellm supports it, pr-agent probably can too. But I am not sure they do; try looking through their documentation.

iamNoah1 (Author) commented Jan 2, 2025

@mrT23 litellm supports it, hence the code snippet I shared; it is a working snippet. The question is how I can configure pr-agent to make use of this litellm feature. Or, if it is not supported by pr-agent, a hint on where to look would help, so I can contribute the support for it.
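
For context, this is how we point pr-agent at Ollama today, following the model-configuration docs (values are placeholders); as far as I can see, nothing here lets us set request headers:

```toml
# configuration.toml / .secrets.toml (placeholder values)
[config]
model = "ollama/llama3.1"

[ollama]
api_base = "redacted"  # our authenticated Ollama endpoint
# no setting here attaches an X-API-key header to requests
```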

mrT23 (Collaborator) commented Jan 2, 2025

https://github.com/Codium-ai/pr-agent-pro/blob/3bd37b533152e58e5b8eea67318ae93493db2079/pr_agent/algo/ai_handlers/litellm_ai_handler.py#L256

Something like this:

```python
if get_settings().ollama.get('headers'):
    kwargs['headers'] = get_settings().ollama.headers
```
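
A slightly fleshed-out sketch of that idea, assuming the headers would live under the `[ollama]` section; the `headers` setting name and the helper function are illustrative, not existing pr-agent configuration:

```python
from pr_agent.config_loader import get_settings

def add_ollama_headers(kwargs: dict) -> dict:
    """Sketch: forward a custom headers dict from an (assumed) [ollama]
    `headers` setting to litellm, which passes `headers` through to the
    underlying HTTP request."""
    ollama = get_settings().get("ollama", None)
    if ollama and ollama.get("headers"):
        kwargs["headers"] = dict(ollama.headers)
    return kwargs
```

Called just before the litellm completion call in `litellm_ai_handler.py`, this would let an authenticated Ollama instance receive its `X-API-key` header.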

mrT23 (Collaborator) commented Jan 2, 2025
