Unable to see container id for wren AI and in wren Ai having error "Failed to create asking task" #1393


Open
Nikita23526 opened this issue Mar 12, 2025 · 11 comments
Labels
bug Something isn't working

Comments

@Nikita23526

Getting error "Failed to create asking task"
I followed the official documentation and used wren-launcher-windows for a custom LLM in a Docker container. I can see a container ID for Ollama, but not for Wren AI.

Expected behavior
I connected it to MySQL, and when I ask about a table it does not respond with an answer; it just shows the error "Failed to create asking task".

Screenshots

(two screenshots attached)

Desktop (please complete the following information):

  • OS: [e.g. iOS]
  • Browser [e.g. chrome, safari]

Wren AI Information

  • Version: [e.g, 0.1.0]

Additional context

Relevant log output
# you should rename this file to config.yaml and put it in ~/.wrenai
# please pay attention to the comments starting with # and adjust the config accordingly, 3 steps basically:
# 1. you need to use your own llm and embedding models
# 2. you need to use the correct pipe definitions based on https://raw.githubusercontent.com/canner/WrenAI/<WRENAI_VERSION_NUMBER>/docker/config.example.yaml
# 3. you need to fill in correct llm and embedding models in the pipe definitions

type: llm
provider: litellm_llm
models:


type: embedder
provider: litellm_embedder
models:


type: engine
provider: wren_ui
endpoint: http://wren-ui:3000


type: document_store
provider: qdrant
location: http://qdrant:6333
embedding_model_dim: 768 # put your embedding model dimension here
timeout: 120
recreate_index: true


# please change the llm and embedder names to the ones you want to use
# the format of llm and embedder should be <provider>.<model_name> such as litellm_llm.gpt-4o-2024-08-06
# the pipes may not be the latest version, please refer to the latest version: https://raw.githubusercontent.com/canner/WrenAI/<WRENAI_VERSION_NUMBER>/docker/config.example.yaml

type: pipeline
pipes:

  - name: db_schema_indexing
    embedder: litellm_embedder.default
    document_store: qdrant
  - name: historical_question_indexing
    embedder: litellm_embedder.default
    document_store: qdrant
  - name: table_description_indexing
    embedder: litellm_embedder.default
    document_store: qdrant
  - name: db_schema_retrieval
    llm: litellm_llm.default
    embedder: litellm_embedder.default
    document_store: qdrant
  - name: historical_question_retrieval
    embedder: litellm_embedder.default
    document_store: qdrant
  - name: sql_generation
    llm: litellm_llm.default
    engine: wren_ui
  - name: sql_correction
    llm: litellm_llm.default
    engine: wren_ui
  - name: followup_sql_generation
    llm: litellm_llm.default
    engine: wren_ui
  - name: sql_summary
    llm: litellm_llm.default
  - name: sql_answer
    llm: litellm_llm.default
    engine: wren_ui
  - name: sql_breakdown
    llm: litellm_llm.default
    engine: wren_ui
  - name: sql_expansion
    llm: litellm_llm.default
    engine: wren_ui
  - name: semantics_description
    llm: litellm_llm.default
  - name: relationship_recommendation
    llm: litellm_llm.default
    engine: wren_ui
  - name: question_recommendation
    llm: litellm_llm.default
  - name: question_recommendation_db_schema_retrieval
    llm: litellm_llm.default
    embedder: litellm_embedder.default
    document_store: qdrant
  - name: question_recommendation_sql_generation
    llm: litellm_llm.default
    engine: wren_ui
  - name: chart_generation
    llm: litellm_llm.default
  - name: chart_adjustment
    llm: litellm_llm.default
  - name: intent_classification
    llm: litellm_llm.default
    embedder: litellm_embedder.default
    document_store: qdrant
  - name: data_assistance
    llm: litellm_llm.default
  - name: sql_pairs_indexing
    document_store: qdrant
    embedder: litellm_embedder.default
  - name: sql_pairs_retrieval
    document_store: qdrant
    embedder: litellm_embedder.default
    llm: litellm_llm.default
  - name: preprocess_sql_data
    llm: litellm_llm.default
  - name: sql_executor
    engine: wren_ui
  - name: sql_question_generation
    llm: litellm_llm.default
  - name: sql_generation_reasoning
    llm: litellm_llm.default
  - name: sql_regeneration
    llm: litellm_llm.default
    engine: wren_ui

settings:
  column_indexing_batch_size: 50
  table_retrieval_size: 10
  table_column_retrieval_size: 100
  allow_using_db_schemas_without_pruning: false # if you want to use db schemas without pruning, set this to true. It will be faster
  query_cache_maxsize: 1000
  query_cache_ttl: 3600
  langfuse_host: https://cloud.langfuse.com
  langfuse_enable: true
  logging_level: DEBUG
  development: true
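Step 1 of the comments above asks for your own llm and embedding models. For an Ollama setup routed through LiteLLM, the filled-in llm and embedder blocks might look like the following sketch; the api_base, model names, and timeouts are placeholders for your own environment, not verified defaults:

```yaml
type: llm
provider: litellm_llm
models:
  - api_base: http://host.docker.internal:11434  # placeholder: your Ollama endpoint
    model: ollama_chat/llama3.1:8b               # placeholder model name
    timeout: 600
    kwargs:
      temperature: 0

---
type: embedder
provider: litellm_embedder
models:
  - api_base: http://host.docker.internal:11434  # placeholder: your Ollama endpoint
    model: ollama/nomic-embed-text               # placeholder embedding model
    timeout: 600
```

Note that embedding_model_dim under the qdrant document store must match the chosen embedding model's output dimension (nomic-embed-text produces 768-dimensional vectors, matching the 768 above).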

On CMD, when I ran docker ps -a, this is what I got:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
9b0e9ea27466 ollama/ollama "/bin/ollama serve" 17 hours ago Up 3 minutes 0.0.0.0:11434->11434/tcp ollama
a79c776ec2a0 ghcr.io/canner/wren-ui:0.20.2 "docker-entrypoint.s…" 18 hours ago Up 3 minutes 0.0.0.0:3000->3000/tcp wrenai-wren-ui-1
d154af9e1e1b ghcr.io/canner/wren-ai-service:0.15.18 "/app/entrypoint.sh" 18 hours ago Up 3 minutes 0.0.0.0:5555->5555/tcp wrenai-wren-ai-service-1
ab4775ac51e3 ghcr.io/canner/wren-engine:0.14.3 "/__cacert_entrypoin…" 18 hours ago Up 3 minutes 7432/tcp, 8080/tcp wrenai-wren-engine-1
9fdd91b4742c ghcr.io/canner/wren-engine-ibis:0.14.3 "fastapi run" 18 hours ago Up 2 minutes 8000/tcp wrenai-ibis-server-1
9969ba55153d ghcr.io/canner/wren-bootstrap:0.1.5 "/bin/sh /app/init.sh" 18 hours ago Exited (0) About a minute ago wrenai-bootstrap-1
c0daefb2a9f6 qdrant/qdrant:v1.11.0 "./entrypoint.sh" 18 hours ago Up 3 minutes 6333-6334/tcp wrenai-qdrant-1

@Nikita23526 Nikita23526 added the bug Something isn't working label Mar 12, 2025
@cyyeh
Member

cyyeh commented Mar 13, 2025

@Nikita23526 could you give me the container log of the AI service by running docker logs -f wrenai-wren-ai-service-1?

@Nikita23526
Author

I have provided it above. Yesterday I connected it to MySQL with a database having a single table, asked a few questions, and it responded; but today I made a database with 5 tables, and when I ask a question it shows "failed to create task".

docker logs -f wrenai-wren-ai-service-1
Waiting for qdrant to start...
qdrant has started.
Waiting for wren-ai-service to start...
INFO: Started server process [8]
INFO: Waiting for application startup.
I0315 11:00:36.971 8 wren-ai-service:40] Imported Provider: src.providers.document_store
I0315 11:00:36.991 8 wren-ai-service:64] Registering provider: qdrant
I0315 11:00:36.992 8 wren-ai-service:40] Imported Provider: src.providers.document_store.qdrant
I0315 11:00:36.994 8 wren-ai-service:40] Imported Provider: src.providers.embedder
I0315 11:00:38.356 8 wren-ai-service:64] Registering provider: azure_openai_embedder
I0315 11:00:38.357 8 wren-ai-service:40] Imported Provider: src.providers.embedder.azure_openai
I0315 11:00:41.236 8 wren-ai-service:64] Registering provider: litellm_embedder
I0315 11:00:41.236 8 wren-ai-service:40] Imported Provider: src.providers.embedder.litellm
I0315 11:00:41.244 8 wren-ai-service:64] Registering provider: ollama_embedder
I0315 11:00:41.244 8 wren-ai-service:40] Imported Provider: src.providers.embedder.ollama
I0315 11:00:41.247 8 wren-ai-service:64] Registering provider: openai_embedder
I0315 11:00:41.247 8 wren-ai-service:40] Imported Provider: src.providers.embedder.openai
I0315 11:00:41.248 8 wren-ai-service:40] Imported Provider: src.providers.engine
I0315 11:00:41.251 8 wren-ai-service:64] Registering provider: wren_ui
I0315 11:00:41.251 8 wren-ai-service:64] Registering provider: wren_ibis
I0315 11:00:41.251 8 wren-ai-service:64] Registering provider: wren_engine
I0315 11:00:41.252 8 wren-ai-service:40] Imported Provider: src.providers.engine.wren
I0315 11:00:41.254 8 wren-ai-service:40] Imported Provider: src.providers.llm
I0315 11:00:41.297 8 wren-ai-service:64] Registering provider: azure_openai_llm
I0315 11:00:41.297 8 wren-ai-service:40] Imported Provider: src.providers.llm.azure_openai
I0315 11:00:41.299 8 wren-ai-service:64] Registering provider: litellm_llm
I0315 11:00:41.299 8 wren-ai-service:40] Imported Provider: src.providers.llm.litellm
I0315 11:00:41.308 8 wren-ai-service:64] Registering provider: ollama_llm
I0315 11:00:41.308 8 wren-ai-service:40] Imported Provider: src.providers.llm.ollama
I0315 11:00:41.497 8 wren-ai-service:64] Registering provider: openai_llm
I0315 11:00:41.497 8 wren-ai-service:40] Imported Provider: src.providers.llm.openai
I0315 11:00:41.497 8 wren-ai-service:40] Imported Provider: src.providers.loader
ERROR: Traceback (most recent call last):
File "/app/.venv/lib/python3.12/site-packages/starlette/routing.py", line 692, in lifespan
async with self.lifespan_context(app) as maybe_state:
File "/usr/local/lib/python3.12/contextlib.py", line 204, in aenter
return await anext(self.gen)
^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/fastapi/routing.py", line 133, in merged_lifespan
async with original_context(app) as maybe_original_state:
File "/usr/local/lib/python3.12/contextlib.py", line 204, in aenter
return await anext(self.gen)
^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/fastapi/routing.py", line 133, in merged_lifespan
async with original_context(app) as maybe_original_state:
File "/usr/local/lib/python3.12/contextlib.py", line 204, in aenter
return await anext(self.gen)
^^^^^^^^^^^^^^^^^^^^^
File "/src/main.py", line 32, in lifespan
app.state.service_container = create_service_container(pipe_components, settings)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/src/globals.py", line 49, in create_service_container
**pipe_components["semantics_description"],
~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^
KeyError: 'semantics_description'

ERROR: Application startup failed. Exiting.
Timeout: wren-ai-service did not start within 60 seconds
Waiting for qdrant to start...
qdrant has started.
Waiting for wren-ai-service to start...
INFO: Started server process [8]
INFO: Waiting for application startup.
I0315 11:01:28.998 8 wren-ai-service:40] Imported Provider: src.providers.document_store
I0315 11:01:28.999 8 wren-ai-service:64] Registering provider: qdrant
I0315 11:01:29.000 8 wren-ai-service:40] Imported Provider: src.providers.document_store.qdrant
I0315 11:01:29.000 8 wren-ai-service:40] Imported Provider: src.providers.embedder
I0315 11:01:29.344 8 wren-ai-service:64] Registering provider: azure_openai_embedder
I0315 11:01:29.344 8 wren-ai-service:40] Imported Provider: src.providers.embedder.azure_openai
I0315 11:01:30.366 8 wren-ai-service:64] Registering provider: litellm_embedder
I0315 11:01:30.366 8 wren-ai-service:40] Imported Provider: src.providers.embedder.litellm
I0315 11:01:30.367 8 wren-ai-service:64] Registering provider: ollama_embedder
I0315 11:01:30.367 8 wren-ai-service:40] Imported Provider: src.providers.embedder.ollama
I0315 11:01:30.368 8 wren-ai-service:64] Registering provider: openai_embedder
I0315 11:01:30.368 8 wren-ai-service:40] Imported Provider: src.providers.embedder.openai
I0315 11:01:30.368 8 wren-ai-service:40] Imported Provider: src.providers.engine
I0315 11:01:30.369 8 wren-ai-service:64] Registering provider: wren_ui
I0315 11:01:30.369 8 wren-ai-service:64] Registering provider: wren_ibis
I0315 11:01:30.369 8 wren-ai-service:64] Registering provider: wren_engine
I0315 11:01:30.369 8 wren-ai-service:40] Imported Provider: src.providers.engine.wren
I0315 11:01:30.370 8 wren-ai-service:40] Imported Provider: src.providers.llm
I0315 11:01:30.375 8 wren-ai-service:64] Registering provider: azure_openai_llm
I0315 11:01:30.376 8 wren-ai-service:40] Imported Provider: src.providers.llm.azure_openai
I0315 11:01:30.376 8 wren-ai-service:64] Registering provider: litellm_llm
I0315 11:01:30.376 8 wren-ai-service:40] Imported Provider: src.providers.llm.litellm
I0315 11:01:30.378 8 wren-ai-service:64] Registering provider: ollama_llm
I0315 11:01:30.378 8 wren-ai-service:40] Imported Provider: src.providers.llm.ollama
I0315 11:01:30.428 8 wren-ai-service:64] Registering provider: openai_llm
I0315 11:01:30.428 8 wren-ai-service:40] Imported Provider: src.providers.llm.openai
I0315 11:01:30.428 8 wren-ai-service:40] Imported Provider: src.providers.loader
ERROR: Traceback (most recent call last):
File "/app/.venv/lib/python3.12/site-packages/starlette/routing.py", line 692, in lifespan
async with self.lifespan_context(app) as maybe_state:
File "/usr/local/lib/python3.12/contextlib.py", line 204, in aenter
return await anext(self.gen)
^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/fastapi/routing.py", line 133, in merged_lifespan
async with original_context(app) as maybe_original_state:
File "/usr/local/lib/python3.12/contextlib.py", line 204, in aenter
return await anext(self.gen)
^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/fastapi/routing.py", line 133, in merged_lifespan
async with original_context(app) as maybe_original_state:
File "/usr/local/lib/python3.12/contextlib.py", line 204, in aenter
return await anext(self.gen)
^^^^^^^^^^^^^^^^^^^^^
File "/src/main.py", line 32, in lifespan
app.state.service_container = create_service_container(pipe_components, settings)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/src/globals.py", line 49, in create_service_container
**pipe_components["semantics_description"],
~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^
KeyError: 'semantics_description'

ERROR: Application startup failed. Exiting.
Timeout: wren-ai-service did not start within 60 seconds
Waiting for qdrant to start...
qdrant has started.
Waiting for wren-ai-service to start...
INFO: Started server process [8]
INFO: Waiting for application startup.
I0315 11:02:30.019 8 wren-ai-service:40] Imported Provider: src.providers.document_store
I0315 11:02:30.020 8 wren-ai-service:64] Registering provider: qdrant
I0315 11:02:30.020 8 wren-ai-service:40] Imported Provider: src.providers.document_store.qdrant
I0315 11:02:30.021 8 wren-ai-service:40] Imported Provider: src.providers.embedder
I0315 11:02:30.374 8 wren-ai-service:64] Registering provider: azure_openai_embedder
I0315 11:02:30.374 8 wren-ai-service:40] Imported Provider: src.providers.embedder.azure_openai
I0315 11:02:31.400 8 wren-ai-service:64] Registering provider: litellm_embedder
I0315 11:02:31.400 8 wren-ai-service:40] Imported Provider: src.providers.embedder.litellm
I0315 11:02:31.401 8 wren-ai-service:64] Registering provider: ollama_embedder
I0315 11:02:31.401 8 wren-ai-service:40] Imported Provider: src.providers.embedder.ollama
I0315 11:02:31.402 8 wren-ai-service:64] Registering provider: openai_embedder
I0315 11:02:31.402 8 wren-ai-service:40] Imported Provider: src.providers.embedder.openai
I0315 11:02:31.403 8 wren-ai-service:40] Imported Provider: src.providers.engine
I0315 11:02:31.403 8 wren-ai-service:64] Registering provider: wren_ui
I0315 11:02:31.403 8 wren-ai-service:64] Registering provider: wren_ibis
I0315 11:02:31.404 8 wren-ai-service:64] Registering provider: wren_engine
I0315 11:02:31.404 8 wren-ai-service:40] Imported Provider: src.providers.engine.wren
I0315 11:02:31.404 8 wren-ai-service:40] Imported Provider: src.providers.llm
I0315 11:02:31.410 8 wren-ai-service:64] Registering provider: azure_openai_llm
I0315 11:02:31.410 8 wren-ai-service:40] Imported Provider: src.providers.llm.azure_openai
I0315 11:02:31.410 8 wren-ai-service:64] Registering provider: litellm_llm
I0315 11:02:31.410 8 wren-ai-service:40] Imported Provider: src.providers.llm.litellm
I0315 11:02:31.411 8 wren-ai-service:64] Registering provider: ollama_llm
I0315 11:02:31.411 8 wren-ai-service:40] Imported Provider: src.providers.llm.ollama
I0315 11:02:31.455 8 wren-ai-service:64] Registering provider: openai_llm
I0315 11:02:31.455 8 wren-ai-service:40] Imported Provider: src.providers.llm.openai
I0315 11:02:31.455 8 wren-ai-service:40] Imported Provider: src.providers.loader
ERROR: Traceback (most recent call last):
File "/app/.venv/lib/python3.12/site-packages/starlette/routing.py", line 692, in lifespan
async with self.lifespan_context(app) as maybe_state:
File "/usr/local/lib/python3.12/contextlib.py", line 204, in aenter
return await anext(self.gen)
^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/fastapi/routing.py", line 133, in merged_lifespan
async with original_context(app) as maybe_original_state:
File "/usr/local/lib/python3.12/contextlib.py", line 204, in aenter
return await anext(self.gen)
^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/fastapi/routing.py", line 133, in merged_lifespan
async with original_context(app) as maybe_original_state:
File "/usr/local/lib/python3.12/contextlib.py", line 204, in aenter
return await anext(self.gen)
^^^^^^^^^^^^^^^^^^^^^
File "/src/main.py", line 32, in lifespan
app.state.service_container = create_service_container(pipe_components, settings)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/src/globals.py", line 49, in create_service_container
**pipe_components["semantics_description"],
~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^
KeyError: 'semantics_description'

ERROR: Application startup failed. Exiting.
Timeout: wren-ai-service did not start within 60 seconds
Waiting for qdrant to start...
qdrant has started.
Waiting for wren-ai-service to start...
INFO: Started server process [8]
INFO: Waiting for application startup.
I0315 11:03:31.035 8 wren-ai-service:40] Imported Provider: src.providers.document_store
I0315 11:03:31.036 8 wren-ai-service:64] Registering provider: qdrant
I0315 11:03:31.037 8 wren-ai-service:40] Imported Provider: src.providers.document_store.qdrant
I0315 11:03:31.037 8 wren-ai-service:40] Imported Provider: src.providers.embedder
I0315 11:03:31.346 8 wren-ai-service:64] Registering provider: azure_openai_embedder
I0315 11:03:31.346 8 wren-ai-service:40] Imported Provider: src.providers.embedder.azure_openai
I0315 11:03:32.403 8 wren-ai-service:64] Registering provider: litellm_embedder
I0315 11:03:32.403 8 wren-ai-service:40] Imported Provider: src.providers.embedder.litellm
I0315 11:03:32.404 8 wren-ai-service:64] Registering provider: ollama_embedder
I0315 11:03:32.404 8 wren-ai-service:40] Imported Provider: src.providers.embedder.ollama
I0315 11:03:32.405 8 wren-ai-service:64] Registering provider: openai_embedder
I0315 11:03:32.405 8 wren-ai-service:40] Imported Provider: src.providers.embedder.openai
I0315 11:03:32.405 8 wren-ai-service:40] Imported Provider: src.providers.engine
I0315 11:03:32.406 8 wren-ai-service:64] Registering provider: wren_ui
I0315 11:03:32.406 8 wren-ai-service:64] Registering provider: wren_ibis
I0315 11:03:32.406 8 wren-ai-service:64] Registering provider: wren_engine
I0315 11:03:32.406 8 wren-ai-service:40] Imported Provider: src.providers.engine.wren
I0315 11:03:32.406 8 wren-ai-service:40] Imported Provider: src.providers.llm
I0315 11:03:32.413 8 wren-ai-service:64] Registering provider: azure_openai_llm
I0315 11:03:32.414 8 wren-ai-service:40] Imported Provider: src.providers.llm.azure_openai
I0315 11:03:32.414 8 wren-ai-service:64] Registering provider: litellm_llm
I0315 11:03:32.414 8 wren-ai-service:40] Imported Provider: src.providers.llm.litellm
I0315 11:03:32.416 8 wren-ai-service:64] Registering provider: ollama_llm
I0315 11:03:32.416 8 wren-ai-service:40] Imported Provider: src.providers.llm.ollama
I0315 11:03:32.461 8 wren-ai-service:64] Registering provider: openai_llm
I0315 11:03:32.461 8 wren-ai-service:40] Imported Provider: src.providers.llm.openai
I0315 11:03:32.461 8 wren-ai-service:40] Imported Provider: src.providers.loader
ERROR: Traceback (most recent call last):
File "/app/.venv/lib/python3.12/site-packages/starlette/routing.py", line 692, in lifespan
async with self.lifespan_context(app) as maybe_state:
File "/usr/local/lib/python3.12/contextlib.py", line 204, in aenter
return await anext(self.gen)
^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/fastapi/routing.py", line 133, in merged_lifespan
async with original_context(app) as maybe_original_state:
File "/usr/local/lib/python3.12/contextlib.py", line 204, in aenter
return await anext(self.gen)
^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/fastapi/routing.py", line 133, in merged_lifespan
async with original_context(app) as maybe_original_state:
File "/usr/local/lib/python3.12/contextlib.py", line 204, in aenter
return await anext(self.gen)
^^^^^^^^^^^^^^^^^^^^^
File "/src/main.py", line 32, in lifespan
app.state.service_container = create_service_container(pipe_components, settings)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/src/globals.py", line 49, in create_service_container
**pipe_components["semantics_description"],
~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^
KeyError: 'semantics_description'

ERROR: Application startup failed. Exiting.
Timeout: wren-ai-service did not start within 60 seconds
Waiting for qdrant to start...
qdrant has started.
Waiting for wren-ai-service to start...
INFO: Started server process [9]
INFO: Waiting for application startup.
I0315 11:04:32.134 9 wren-ai-service:40] Imported Provider: src.providers.document_store
I0315 11:04:32.136 9 wren-ai-service:64] Registering provider: qdrant
I0315 11:04:32.136 9 wren-ai-service:40] Imported Provider: src.providers.document_store.qdrant
I0315 11:04:32.137 9 wren-ai-service:40] Imported Provider: src.providers.embedder
I0315 11:04:32.486 9 wren-ai-service:64] Registering provider: azure_openai_embedder
I0315 11:04:32.486 9 wren-ai-service:40] Imported Provider: src.providers.embedder.azure_openai
I0315 11:04:33.487 9 wren-ai-service:64] Registering provider: litellm_embedder
I0315 11:04:33.487 9 wren-ai-service:40] Imported Provider: src.providers.embedder.litellm
I0315 11:04:33.488 9 wren-ai-service:64] Registering provider: ollama_embedder
I0315 11:04:33.488 9 wren-ai-service:40] Imported Provider: src.providers.embedder.ollama
I0315 11:04:33.489 9 wren-ai-service:64] Registering provider: openai_embedder
I0315 11:04:33.489 9 wren-ai-service:40] Imported Provider: src.providers.embedder.openai
I0315 11:04:33.489 9 wren-ai-service:40] Imported Provider: src.providers.engine
I0315 11:04:33.490 9 wren-ai-service:64] Registering provider: wren_ui
I0315 11:04:33.490 9 wren-ai-service:64] Registering provider: wren_ibis
I0315 11:04:33.490 9 wren-ai-service:64] Registering provider: wren_engine
I0315 11:04:33.490 9 wren-ai-service:40] Imported Provider: src.providers.engine.wren
I0315 11:04:33.491 9 wren-ai-service:40] Imported Provider: src.providers.llm
I0315 11:04:33.498 9 wren-ai-service:64] Registering provider: azure_openai_llm
I0315 11:04:33.498 9 wren-ai-service:40] Imported Provider: src.providers.llm.azure_openai
I0315 11:04:33.499 9 wren-ai-service:64] Registering provider: litellm_llm
I0315 11:04:33.499 9 wren-ai-service:40] Imported Provider: src.providers.llm.litellm
I0315 11:04:33.500 9 wren-ai-service:64] Registering provider: ollama_llm
I0315 11:04:33.500 9 wren-ai-service:40] Imported Provider: src.providers.llm.ollama
I0315 11:04:33.580 9 wren-ai-service:64] Registering provider: openai_llm
I0315 11:04:33.580 9 wren-ai-service:40] Imported Provider: src.providers.llm.openai
I0315 11:04:33.580 9 wren-ai-service:40] Imported Provider: src.providers.loader
ERROR: Traceback (most recent call last):
File "/app/.venv/lib/python3.12/site-packages/starlette/routing.py", line 692, in lifespan
async with self.lifespan_context(app) as maybe_state:
File "/usr/local/lib/python3.12/contextlib.py", line 204, in aenter
return await anext(self.gen)
^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/fastapi/routing.py", line 133, in merged_lifespan
async with original_context(app) as maybe_original_state:
File "/usr/local/lib/python3.12/contextlib.py", line 204, in aenter
return await anext(self.gen)
^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/fastapi/routing.py", line 133, in merged_lifespan
async with original_context(app) as maybe_original_state:
File "/usr/local/lib/python3.12/contextlib.py", line 204, in aenter
return await anext(self.gen)
^^^^^^^^^^^^^^^^^^^^^
File "/src/main.py", line 32, in lifespan
app.state.service_container = create_service_container(pipe_components, settings)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/src/globals.py", line 49, in create_service_container
**pipe_components["semantics_description"],
~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^
KeyError: 'semantics_description'

ERROR: Application startup failed. Exiting.
Timeout: wren-ai-service did not start within 60 seconds
Waiting for qdrant to start...
qdrant has started.
Waiting for wren-ai-service to start...
INFO: Started server process [8]
INFO: Waiting for application startup.
I0315 11:05:33.205 8 wren-ai-service:40] Imported Provider: src.providers.document_store
I0315 11:05:33.206 8 wren-ai-service:64] Registering provider: qdrant
I0315 11:05:33.206 8 wren-ai-service:40] Imported Provider: src.providers.document_store.qdrant
I0315 11:05:33.206 8 wren-ai-service:40] Imported Provider: src.providers.embedder
I0315 11:05:33.530 8 wren-ai-service:64] Registering provider: azure_openai_embedder
I0315 11:05:33.530 8 wren-ai-service:40] Imported Provider: src.providers.embedder.azure_openai
I0315 11:05:34.563 8 wren-ai-service:64] Registering provider: litellm_embedder
I0315 11:05:34.563 8 wren-ai-service:40] Imported Provider: src.providers.embedder.litellm
I0315 11:05:34.564 8 wren-ai-service:64] Registering provider: ollama_embedder
I0315 11:05:34.565 8 wren-ai-service:40] Imported Provider: src.providers.embedder.ollama
I0315 11:05:34.566 8 wren-ai-service:64] Registering provider: openai_embedder
I0315 11:05:34.566 8 wren-ai-service:40] Imported Provider: src.providers.embedder.openai
I0315 11:05:34.566 8 wren-ai-service:40] Imported Provider: src.providers.engine
I0315 11:05:34.567 8 wren-ai-service:64] Registering provider: wren_ui
I0315 11:05:34.567 8 wren-ai-service:64] Registering provider: wren_ibis
I0315 11:05:34.567 8 wren-ai-service:64] Registering provider: wren_engine
I0315 11:05:34.567 8 wren-ai-service:40] Imported Provider: src.providers.engine.wren
I0315 11:05:34.567 8 wren-ai-service:40] Imported Provider: src.providers.llm
I0315 11:05:34.573 8 wren-ai-service:64] Registering provider: azure_openai_llm
I0315 11:05:34.573 8 wren-ai-service:40] Imported Provider: src.providers.llm.azure_openai
I0315 11:05:34.573 8 wren-ai-service:64] Registering provider: litellm_llm
I0315 11:05:34.573 8 wren-ai-service:40] Imported Provider: src.providers.llm.litellm
I0315 11:05:34.575 8 wren-ai-service:64] Registering provider: ollama_llm
I0315 11:05:34.578 8 wren-ai-service:40] Imported Provider: src.providers.llm.ollama
I0315 11:05:34.650 8 wren-ai-service:64] Registering provider: openai_llm
I0315 11:05:34.650 8 wren-ai-service:40] Imported Provider: src.providers.llm.openai
I0315 11:05:34.650 8 wren-ai-service:40] Imported Provider: src.providers.loader
ERROR: Traceback (most recent call last):
File "/app/.venv/lib/python3.12/site-packages/starlette/routing.py", line 692, in lifespan
async with self.lifespan_context(app) as maybe_state:
File "/usr/local/lib/python3.12/contextlib.py", line 204, in aenter
return await anext(self.gen)
^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/fastapi/routing.py", line 133, in merged_lifespan
async with original_context(app) as maybe_original_state:
File "/usr/local/lib/python3.12/contextlib.py", line 204, in aenter
return await anext(self.gen)
^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/fastapi/routing.py", line 133, in merged_lifespan
async with original_context(app) as maybe_original_state:
File "/usr/local/lib/python3.12/contextlib.py", line 204, in aenter
return await anext(self.gen)
^^^^^^^^^^^^^^^^^^^^^
File "/src/main.py", line 32, in lifespan
app.state.service_container = create_service_container(pipe_components, settings)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/src/globals.py", line 49, in create_service_container
**pipe_components["semantics_description"],
~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^
KeyError: 'semantics_description'

ERROR: Application startup failed. Exiting.
Timeout: wren-ai-service did not start within 60 seconds
Waiting for qdrant to start...
qdrant has started.
Waiting for wren-ai-service to start...
INFO: Started server process [8]
INFO: Waiting for application startup.
I0315 11:06:34.784 8 wren-ai-service:40] Imported Provider: src.providers.document_store
I0315 11:06:34.792 8 wren-ai-service:64] Registering provider: qdrant
I0315 11:06:34.793 8 wren-ai-service:40] Imported Provider: src.providers.document_store.qdrant
I0315 11:06:34.793 8 wren-ai-service:40] Imported Provider: src.providers.embedder
I0315 11:06:35.146 8 wren-ai-service:64] Registering provider: azure_openai_embedder
I0315 11:06:35.147 8 wren-ai-service:40] Imported Provider: src.providers.embedder.azure_openai
I0315 11:06:36.343 8 wren-ai-service:64] Registering provider: litellm_embedder
I0315 11:06:36.344 8 wren-ai-service:40] Imported Provider: src.providers.embedder.litellm
I0315 11:06:36.346 8 wren-ai-service:64] Registering provider: ollama_embedder
I0315 11:06:36.346 8 wren-ai-service:40] Imported Provider: src.providers.embedder.ollama
I0315 11:06:36.346 8 wren-ai-service:64] Registering provider: openai_embedder
I0315 11:06:36.346 8 wren-ai-service:40] Imported Provider: src.providers.embedder.openai
I0315 11:06:36.347 8 wren-ai-service:40] Imported Provider: src.providers.engine
I0315 11:06:36.348 8 wren-ai-service:64] Registering provider: wren_ui
I0315 11:06:36.348 8 wren-ai-service:64] Registering provider: wren_ibis
I0315 11:06:36.348 8 wren-ai-service:64] Registering provider: wren_engine
I0315 11:06:36.348 8 wren-ai-service:40] Imported Provider: src.providers.engine.wren
I0315 11:06:36.348 8 wren-ai-service:40] Imported Provider: src.providers.llm
I0315 11:06:36.354 8 wren-ai-service:64] Registering provider: azure_openai_llm
I0315 11:06:36.354 8 wren-ai-service:40] Imported Provider: src.providers.llm.azure_openai
I0315 11:06:36.354 8 wren-ai-service:64] Registering provider: litellm_llm
I0315 11:06:36.354 8 wren-ai-service:40] Imported Provider: src.providers.llm.litellm
I0315 11:06:36.355 8 wren-ai-service:64] Registering provider: ollama_llm
I0315 11:06:36.355 8 wren-ai-service:40] Imported Provider: src.providers.llm.ollama
I0315 11:06:36.392 8 wren-ai-service:64] Registering provider: openai_llm
I0315 11:06:36.392 8 wren-ai-service:40] Imported Provider: src.providers.llm.openai
I0315 11:06:36.396 8 wren-ai-service:40] Imported Provider: src.providers.loader
ERROR: Traceback (most recent call last):
  File "/app/.venv/lib/python3.12/site-packages/starlette/routing.py", line 692, in lifespan
    async with self.lifespan_context(app) as maybe_state:
  File "/usr/local/lib/python3.12/contextlib.py", line 204, in __aenter__
    return await anext(self.gen)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/fastapi/routing.py", line 133, in merged_lifespan
    async with original_context(app) as maybe_original_state:
  File "/usr/local/lib/python3.12/contextlib.py", line 204, in __aenter__
    return await anext(self.gen)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/fastapi/routing.py", line 133, in merged_lifespan
    async with original_context(app) as maybe_original_state:
  File "/usr/local/lib/python3.12/contextlib.py", line 204, in __aenter__
    return await anext(self.gen)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/src/main.py", line 32, in lifespan
    app.state.service_container = create_service_container(pipe_components, settings)
                                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/src/globals.py", line 49, in create_service_container
    **pipe_components["semantics_description"],
      ~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^
KeyError: 'semantics_description'

ERROR: Application startup failed. Exiting.
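The traceback comes from dictionary unpacking: `create_service_container` looks up a `semantics_description` entry in the components built from `config.yaml`, and indexing a dict with a key that was never defined raises `KeyError` immediately. A minimal sketch of the failure mode (hypothetical simplified names, not Wren AI's actual code):

```python
# Sketch: why a pipe missing from config.yaml surfaces as a KeyError
# at startup. pipe_components is built only from the pipes listed in
# the config file, so an omitted pipe name is absent from the dict.

def create_service_container(pipe_components: dict) -> dict:
    # Mirrors `**pipe_components["semantics_description"]` in globals.py:
    # subscripting with a missing key raises KeyError before any
    # service can start.
    return {"semantics_service": dict(**pipe_components["semantics_description"])}

# Config that defines some pipes but not semantics_description:
components = {"sql_generation": {"llm": "ollama.mistral"}}

try:
    create_service_container(components)
except KeyError as missing:
    print(f"missing pipe: {missing}")  # → missing pipe: 'semantics_description'
```

The fix is therefore always on the config side: every pipe name the service expects must appear in the `pipes:` list.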
Timeout: wren-ai-service did not start within 60 seconds
[The launcher then retries: qdrant restarts, wren-ai-service attempts startup again, and the identical provider-registration log and `KeyError: 'semantics_description'` traceback repeat three more times before the launcher gives up.]

@Nikita23526
Author

I am using gemini-flash-2.0 and have also provided the API key in the .env file.
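For a Gemini model, the entry would go under the `litellm_llm` provider from the template at the top of this issue rather than under an Ollama provider. A hedged sketch only; the exact model id and fields should be verified against config.example.yaml for your Wren AI version:

```yaml
type: llm
provider: litellm_llm
models:
  # LiteLLM-style model id for Gemini; verify the exact name against
  # the LiteLLM docs and config.example.yaml (assumption, not confirmed here)
  - model: gemini/gemini-2.0-flash
    timeout: 120
    kwargs:
      temperature: 0
```

The API key is then read from the environment (for LiteLLM's Gemini provider this is conventionally `GEMINI_API_KEY` in the .env file).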

@cyyeh
Member

cyyeh commented Mar 15, 2025

@Nikita23526 The error shows that your AI pipeline definitions are incomplete. Please read the comments carefully and fill in the missing pipeline definitions. Thanks.
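Concretely, the `type: pipeline` section of config.yaml must enumerate every pipe by name; if any expected pipe (such as `semantics_description`) is missing, startup fails with the KeyError shown above. A trimmed sketch of the shape, with pipe names taken from the example config and model references as placeholders:

```yaml
type: pipeline
pipes:
  - name: semantics_description      # the pipe the KeyError complains about
    llm: litellm_llm.<your-model>
  - name: sql_generation
    llm: litellm_llm.<your-model>
    engine: wren_ui
  # ...every other pipe listed in config.example.yaml for your
  # Wren AI version must appear here as well
```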

# the pipes may be not the latest version, please refer to the latest version: https://raw.githubusercontent.com/canner/WrenAI/<WRENAI_VERSION_NUMBER>/docker/config.example.yaml

@Nikita23526
Author

Can you help me set up the pipeline? I think I have followed the instructions correctly, but I still get the same "Failed to create asking task" error.

@Nikita23526
Author

Image

Why am I getting this?

@Nikita23526
Author

2025-03-19 11:27:32 wren-ai-service-1 | File "/src/globals.py", line 49, in create_service_container
2025-03-19 11:27:32 wren-ai-service-1 | **pipe_components["semantics_description"],
2025-03-19 11:27:32 wren-ai-service-1 | ~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^
2025-03-19 11:27:32 wren-ai-service-1 | KeyError: 'semantics_description'
2025-03-19 11:27:32 wren-ai-service-1 |
2025-03-19 11:27:32 wren-ai-service-1 | ERROR: Application startup failed. Exiting.

Even though I have semantics_description: ollama.mistral in my pipeline, I am getting this error.

@cyyeh
Member

cyyeh commented Mar 19, 2025

2025-03-19 11:27:32 wren-ai-service-1 | File "/src/globals.py", line 49, in create_service_container
2025-03-19 11:27:32 wren-ai-service-1 | **pipe_components["semantics_description"],
2025-03-19 11:27:32 wren-ai-service-1 | ~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^
2025-03-19 11:27:32 wren-ai-service-1 | KeyError: 'semantics_description'
2025-03-19 11:27:32 wren-ai-service-1 |
2025-03-19 11:27:32 wren-ai-service-1 | ERROR: Application startup failed. Exiting.

Even though I have semantics_description: ollama.mistral in my pipeline, I am getting this error.

Could you share your config.yaml?

@Nikita23526
Author

Nikita23526 commented Mar 19, 2025

@cyyeh

# -------------------------------
# LLM Configuration (Ollama Mistral)
# -------------------------------
type: llm
provider: ollama
timeout: 120
models:

---
# -------------------------------
# Embedding Model Configuration
# -------------------------------
type: embedder
provider: ollama
models:

---
# -------------------------------
# Wren Engine Configuration
# -------------------------------
type: engine
provider: wren_ui
endpoint: http://wren-ui:3000

---
# -------------------------------
# Document Store Configuration
# -------------------------------
type: document_store
provider: qdrant
location: http://qdrant:6333
embedding_model_dim: 3072
timeout: 120
recreate_index: true

---
# -------------------------------
# AI Pipeline Configuration
# -------------------------------
type: pipeline
pipes:
  - name: db_schema_indexing
    embedder: ollama_embedder.text-embedding-3-large
    document_store: qdrant
  - name: historical_question_indexing
    embedder: ollama_embedder.text-embedding-3-large
    document_store: qdrant
  - name: table_description_indexing
    embedder: ollama_embedder.text-embedding-3-large
    document_store: qdrant
  - name: db_schema_retrieval
    llm: ollama.mistral
    embedder: ollama_embedder.text-embedding-3-large
    document_store: qdrant
  - name: historical_question_retrieval
    embedder: ollama_embedder.text-embedding-3-large
    document_store: qdrant
  - name: sql_generation
    llm: ollama.mistral
    engine: wren_ui
  - name: sql_correction
    llm: ollama.mistral
    engine: wren_ui
  - name: followup_sql_generation
    llm: ollama.mistral
    engine: wren_ui
  - name: sql_summary
    llm: ollama.mistral
  - name: sql_answer
    llm: ollama.mistral
  - name: sql_breakdown
    llm: ollama.mistral
    engine: wren_ui
  - name: sql_expansion
    llm: ollama.mistral
    engine: wren_ui
  - name: semantics_description
    llm: ollama.mistral
  - name: relationship_recommendation
    llm: ollama.mistral
    engine: wren_ui
  - name: question_recommendation
    llm: ollama.mistral
  - name: question_recommendation_db_schema_retrieval
    llm: ollama.mistral
    embedder: ollama_embedder.text-embedding-3-large
    document_store: qdrant
  - name: question_recommendation_sql_generation
    llm: ollama.mistral
    engine: wren_ui
  - name: intent_classification
    llm: ollama.mistral
    embedder: ollama_embedder.text-embedding-3-large
    document_store: qdrant
  - name: data_assistance
    llm: ollama.mistral
  - name: sql_pairs_indexing
    document_store: qdrant
    embedder: ollama_embedder.text-embedding-3-large
  - name: sql_pairs_retrieval
    document_store: qdrant
    embedder: ollama_embedder.text-embedding-3-large
    llm: ollama.mistral
  - name: preprocess_sql_data
    llm: ollama.mistral
  - name: sql_executor
    engine: wren_ui
  - name: chart_generation
    llm: ollama.mistral
  - name: chart_adjustment
    llm: ollama.mistral
  - name: sql_question_generation
    llm: ollama.mistral
  - name: sql_generation_reasoning
    llm: ollama.mistral
  - name: sql_regeneration
    llm: ollama.mistral
    engine: wren_ui

---
# -------------------------------
# General Settings
# -------------------------------
settings:
  engine_timeout: 30
  column_indexing_batch_size: 50
  table_retrieval_size: 10
  table_column_retrieval_size: 100
  allow_using_db_schemas_without_pruning: false
  query_cache_maxsize: 1000
  query_cache_ttl: 3600
  langfuse_host: https://cloud.langfuse.com
  langfuse_enable: true
  logging_level: DEBUG
  development: false
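One way to catch this class of error before starting the containers is to compare the pipe names in your config.yaml against the full list in config.example.yaml. A minimal sketch; the required-name set below is transcribed from the config in this thread, not an authoritative list, so verify it against the example config for your Wren AI version:

```python
# Check that every required pipe appears in a parsed pipeline config,
# to catch a KeyError like 'semantics_description' before the service
# boots. Extend REQUIRED_PIPES from config.example.yaml as needed.

REQUIRED_PIPES = {
    "db_schema_indexing", "db_schema_retrieval", "sql_generation",
    "sql_correction", "semantics_description", "intent_classification",
    # ...remaining names from config.example.yaml
}

def missing_pipes(pipeline_doc: dict) -> set:
    """Return required pipe names absent from the pipeline document."""
    defined = {pipe["name"] for pipe in pipeline_doc.get("pipes", [])}
    return REQUIRED_PIPES - defined

# Example: a config that only defines sql_generation
partial = {"type": "pipeline", "pipes": [{"name": "sql_generation"}]}
print(sorted(missing_pipes(partial)))  # semantics_description is among the missing
```

Running this against the parsed YAML (e.g. via PyYAML) before `docker compose up` turns a cryptic startup crash into a readable list of missing pipe names.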

@Nikita23526
Author

@cyyeh please review

@Marsedward

The bootstrap container in Docker cannot run and connect to LM Studio. Please help.
