From 60699266d349abd29fe6b0072a50c47d4704d0ae Mon Sep 17 00:00:00 2001
From: Dimitri Gilbert
Date: Mon, 4 Nov 2024 16:12:45 +0100
Subject: [PATCH] Update configuration.mdx (#817)

added ollama configuration detail when running on a server

---
 docs/src/content/docs/getting-started/configuration.mdx | 7 +++++++
 1 file changed, 7 insertions(+)

diff --git a/docs/src/content/docs/getting-started/configuration.mdx b/docs/src/content/docs/getting-started/configuration.mdx
index 5ee95fb34a..1da01e221a 100644
--- a/docs/src/content/docs/getting-started/configuration.mdx
+++ b/docs/src/content/docs/getting-started/configuration.mdx
@@ -817,6 +817,13 @@ script({
 })
 ```
+If Ollama runs on a server or a different computer, you must configure the `OLLAMA_API_BASE` environment variable.
+
+```txt OLLAMA_API_BASE
+OLLAMA_API_BASE=http://<host>:<port>/v1
+```
+Since GenAIScript uses the OpenAI-style API, you must use the `/v1` endpoints, not `/api`.
+
 
 ### Llamafile
 
 [https://llamafile.ai/](https://llamafile.ai/) is a single file desktop application
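
For reference, the instructions this patch adds would be used roughly as follows before launching GenAIScript. This is a hedged sketch, not part of the patch: the host address is a made-up example, and 11434 is Ollama's default listening port.

```shell
# Point GenAIScript at a remote Ollama instance.
# 192.168.1.50 is an example host; 11434 is Ollama's default port.
# Note the /v1 suffix: GenAIScript talks to the OpenAI-compatible
# endpoints, which live under /v1 rather than /api.
export OLLAMA_API_BASE="http://192.168.1.50:11434/v1"
echo "$OLLAMA_API_BASE"
```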