
Ollama connection Issue #510

@SwissKerim


Describe the bug

"If you are using Ollama locally, this is likely the CORS (Cross-Origin Resource Sharing) security issue, where you will need to set OLLAMA_ORIGINS=* or OLLAMA_ORIGINS=https://airi.moeru.ai,http://localhost environment variable before launching Ollama server to make this work."

[screenshot]
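For reference, outside Docker the variable has to be set in the environment of the ollama serve process before launch. A minimal sketch (the origin list is copied from the warning above):

  # Linux/macOS: set the variable just for this invocation of the server
  OLLAMA_ORIGINS="https://airi.moeru.ai,http://localhost" ollama serve

  # Windows (PowerShell): set it for the session, then start the server
  $env:OLLAMA_ORIGINS = "https://airi.moeru.ai,http://localhost"
  ollama serve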

But after continuing, I can see the models, but I cannot select "Save and Continue".

[screenshot]

I'm running Ollama in a Docker container and had to set

  • OLLAMA_ORIGINS=*

because

  • OLLAMA_ORIGINS=https://airi.moeru.ai,http://localhost

didn't work.

Is there an issue with Ollama, or am I just misconfiguring something?

curl from the host system:

curl http://localhost:11434/api/version
{"version":"0.9.5"}

System Info

Windows 11, Airi Desktop Version
Ollama running in Docker via compose:
  ollama:
    image: ollama/ollama
    container_name: ollama
    ports:
      - "11434:11434"
    volumes:
      - /mnt/host/d/Container/ollama:/root/.ollama
    environment:
      - OLLAMA_HOST=0.0.0.0
      - NVIDIA_VISIBLE_DEVICES=all
      - NVIDIA_DRIVER_CAPABILITIES=compute,utility
      - OLLAMA_ORIGINS=*
    runtime: nvidia
    restart: unless-stopped
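To rule out the variable never reaching the server, it can be verified inside the running container (a sketch using the container_name from the compose file above):

  # Confirm the variable is actually set in the container's environment
  docker exec ollama printenv OLLAMA_ORIGINS
  # expected output: *

  # Compose environment changes only take effect once the container is recreated
  docker compose up -d --force-recreate ollama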

Validations

  • Follow our Code of Conduct
  • Read the Contributing Guide.
  • Check that there isn't already an issue that reports the same bug to avoid creating a duplicate.
  • Check that this is a concrete bug. For Q&A, please open a GitHub Discussion instead.

Contributions

  • I am willing to submit a PR to fix this issue
  • I am willing to submit a PR with failing tests (actually just go ahead and do it, thanks!)
