Bug description
chat-ui can't communicate with a local VLLM instance. This may only happen when plain HTTP is used; I don't have SSL certificates.
Steps to reproduce
- Run VLLM on your local PC on the default port, so the API is served at http://localhost:8000/v1. My command is below.
- Run 'docker compose up' with the file given below.
- On your PC, open http://localhost:3000. The page loads and the VLLM model is detected properly.
- Start a new chat and send a message. Nothing happens.
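As a sanity check before involving chat-ui at all, the VLLM side of the chain can be probed directly. This is a minimal sketch, assuming VLLM is listening on http://localhost:8000 as in the command below; the `probe` helper is hypothetical, not part of either project.

```python
# Probe the OpenAI-compatible VLLM endpoint directly, bypassing chat-ui.
import json
import urllib.error
import urllib.request


def probe(url: str, timeout: float = 3.0):
    """Return the parsed JSON body, or None if the server is unreachable."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return json.loads(resp.read())
    except (urllib.error.URLError, OSError):
        return None


models = probe("http://localhost:8000/v1/models")
if models is None:
    print("VLLM is not reachable -- fix this before debugging chat-ui")
else:
    # The OpenAI-style /v1/models response lists models under "data".
    print([m["id"] for m in models.get("data", [])])
```

If this prints the served model name, VLLM itself is fine and the problem is between the browser and chat-ui, which matches the logs below.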
VLLM command:
vllm serve $MODEL_PATH \
  --served-model-name $MODEL_NAME \
  --port 8000 \
  --host 0.0.0.0 \
  --max-model-len 262144 \
  --max-num-seqs 4 \
  --enable-prefix-caching \
  --gpu-memory-utilization 0.6 \
  --load-format fastsafetensors \
  --enable-auto-tool-choice \
  --tool-call-parser qwen3_coder \
  --reasoning-parser qwen3 \
  --max-num-batched-tokens 8192 \
  --trust-remote-code \
  --mm-encoder-tp-mode data \
  -O3
chat-ui's docker-compose.yml
services:
  chat-ui:
    image: ghcr.io/huggingface/chat-ui-db:latest
    container_name: chat-ui
    init: true
    network_mode: host
    environment:
      - OPENAI_BASE_URL=http://localhost:8000/v1
      - OPENAI_API_KEY=not-needed
    volumes:
      - chat-ui-data:/data
volumes:
  chat-ui-data:
Screenshots
Context
Logs
I can see the app successfully fetching the model list and showing the models in the UI settings page.
However, when I try to send a question, nothing happens. This exception is thrown in my browser:
GET http://192.168.100.100:3000/api/v2/conversations/69e7f5df4ec7018555a490d2
Error: {"message":"You don't have access to this conversation. If someone gave you this link, ask them to use the 'share' feature instead."}
According to the VLLM logs, no request other than /v1/models (to fetch the model list) ever reaches VLLM.
There's also this error in the browser console log when I load the app which may be related:
Cookie “hf-chat” has been rejected because a non-HTTPS cookie can’t be set as “secure”
Could this be a case of no cookie -> all conversations denied? Is it because I'm not using SSL? Whether I use SSL should be my decision, shouldn't it?
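The suspected mechanism can be illustrated in a few lines. This is only a sketch of standard cookie semantics, not chat-ui's actual code; the session value is hypothetical, and the assumption is that chat-ui sets the Secure attribute on "hf-chat" unconditionally.

```python
# Illustrate the suspected failure: a session cookie emitted with the
# Secure attribute, which browsers refuse to store over plain HTTP.
from http.cookies import SimpleCookie

cookie = SimpleCookie()
cookie["hf-chat"] = "session-id"    # hypothetical session value
cookie["hf-chat"]["secure"] = True  # what chat-ui appears to set
cookie["hf-chat"]["httponly"] = True

header = cookie.output(header="Set-Cookie:")
print(header)
```

Per the Set-Cookie spec, a Secure cookie delivered over http:// is rejected by the browser (exactly the Firefox console warning above), so the follow-up GET to /api/v2/conversations/... carries no session at all and the server answers with the "You don't have access to this conversation" error.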
Specs
- OS: Ubuntu 24.04
- Browser: Firefox
- chat-ui commit: chat-ui-db b1407709b522, latest as of today
Config
Refer to docker-compose.yml
Notes