
Custom Generative AI Config instructions are false #3946

Open
dhdaines opened this issue Feb 10, 2025 · 0 comments

dhdaines commented Feb 10, 2025

A few issues with the documentation here: https://docs.onyx.app/gen_ai_configs/ollama

  1. You need to add a model under "Model Names". This is not mentioned in the documentation.
  2. The model name you need to add is not what the UI suggests. One would be inclined to think it's something like `ollama/llama3.2` (except that, oops, the page the UI links to doesn't even mention llama3.2...), but no, it is just `llama3.2`.
  3. When deploying with docker-compose, you also need to make sure Ollama is listening on an interface that is reachable from inside the Docker containers. I'm not sure it does this by default (hopefully not, as that would be a gaping security hole). You can do it by starting Ollama manually, e.g. with `OLLAMA_HOST=172.17.0.1:11434 ./bin/ollama serve`.
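For point 3, a minimal command sketch (a config fragment, not a tested recipe): it assumes Docker's default bridge gateway address of `172.17.0.1` and an Ollama binary at `./bin/ollama`; both may differ on your setup.

```shell
# Assumption: 172.17.0.1 is Docker's default bridge gateway on this host
# (check with `ip addr show docker0`); adjust if your network differs.
# Bind Ollama to the bridge interface so containers can reach it,
# without exposing it on all interfaces (0.0.0.0):
OLLAMA_HOST=172.17.0.1:11434 ./bin/ollama serve

# From the host (or from inside a container on the default bridge
# network), the API should then answer at:
#   curl http://172.17.0.1:11434/api/version
```

Binding to the bridge gateway rather than `0.0.0.0` keeps the Ollama API off your public interfaces while still making it visible to the docker-compose services.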

Looking forward to actually using Onyx now ;-)
