UI Not Showing Text Box for Chat #2626
Answered by danny-avila
ryancurrah asked this question in Troubleshooting
I have started LibreChat locally using Docker Compose with MongoDB. It starts up fine, but when I get to the UI I see no text box, and in the JavaScript console I get the log and error below. I'm not seeing any relevant info in the container logs either. Any clue as to what I'm missing?
Here is my docker-compose:

```yaml
version: '3.8'
services:
  mongo:
    image: mongo
    container_name: mongo
    volumes:
      - ~/mongo-data:/data/db
    ports:
      - "27017:27017"
    restart: always
  librechat:
    image: ghcr.io/danny-avila/librechat:v0.7.1
    container_name: librechat
    environment:
      MONGO_URI: mongodb://mongo:27017/LibreChat
      CREDS_KEY: fb838dd31c0c0fe8ba79e9a01330b13c0fcfba4066e820fd87a1534ea084c2f3
      CREDS_IV: 18663af1799397ae57de52cf348614de
      JWT_SECRET: 634f8e3f78d485278ee13cb37f909e0c25214d2760f9e7c957ee099b8e48a3d9
      JWT_REFRESH_SECRET: ff24a17e8999b8beb917ec26321fb2d573e9ca6b32b8a499cea4ff7a2961a28a
      ALLOW_REGISTRATION: true
    volumes:
      - ./config.yaml:/app/librechat.yaml
    ports:
      - "3080:3080"
    restart: unless-stopped
    depends_on:
      - mongo
```

Here is my config:

```yaml
version: 1.0.6
endpoints:
  custom:
    - name: vllm
      baseURL: http://[REDACTED]/v1
      models:
        default: ["meta-llama/Meta-Llama-3-70B"]
        fetch: true
      titleConvo: true
      titleModel: "meta-llama/Meta-Llama-3-70B"
      summarize: false
      summaryModel: "meta-llama/Meta-Llama-3-70B"
      forcePrompt: false
      addParams:
        "stop": [
          "<|start_header_id|>",
          "<|end_header_id|>",
          "<|eot_id|>",
          "<|reserved_special_token"
        ]
```

Logs:
Answered by danny-avila on May 6, 2024
Replies: 1 comment · 1 reply
An `apiKey` is required, but I'll see about a way to prevent that behavior since a lot of local services don't require it. For now, just add this:

```yaml
- name: vllm
  baseURL: http://[REDACTED]/v1
  apiKey: vllm
```

See the ollama example for reference: https://docs.librechat.ai/install/configuration/ai_endpoints.html#ollama
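For clarity, here is a sketch of how that fix would slot into the poster's existing `config.yaml` (assumed merge; the value `vllm` for `apiKey` is just a placeholder, since many local OpenAI-compatible servers don't validate the key):

```yaml
version: 1.0.6
endpoints:
  custom:
    - name: vllm
      # Placeholder key added so LibreChat treats the endpoint as configured;
      # the local vLLM server is assumed to ignore its value.
      apiKey: vllm
      baseURL: http://[REDACTED]/v1
      models:
        default: ["meta-llama/Meta-Llama-3-70B"]
        fetch: true
      # ...remaining options (titleConvo, addParams, etc.) unchanged
```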
Answer selected by ryancurrah