When using the demo for testing, an error occurred at "columns_and_tables_needed = orjson.loads": "orjson.JSONDecodeError: unexpected character: line 1 column 1".
#1426 · Open · wxp2023 opened this issue on Mar 19, 2025 · 1 comment
My question is 'Information of the three individuals with the highest salaries'
The log content is as follows:
I0319 06:15:48.781 8 wren-ai-service:298] db_schemas token count is greater than 100,000, so we will prune columns
INFO: 172.25.0.3:57886 - "GET /v1/asks/21ad676d-2688-4764-a9c2-f11255ddab8d/result HTTP/1.1" 200 OK
********************************************************************************
> construct_retrieval_results [src.pipelines.retrieval.retrieval.construct_retrieval_results()] encountered an error<
> Node inputs:
{'check_using_db_schemas_without_pruning': "<Task finished name='Task-581' "
'coro=<AsyncGraphAda...',
'construct_db_schemas': "<Task finished name='Task-580' "
'coro=<AsyncGraphAda...',
'dbschema_retrieval': "<Task finished name='Task-579' coro=<AsyncGraphAda...",
'filter_columns_in_tables': "<Task finished name='Task-583' "
'coro=<AsyncGraphAda...'}
********************************************************************************
Traceback (most recent call last):
File "/app/.venv/lib/python3.12/site-packages/hamilton/async_driver.py", line 122, in new_fn
await fn(**fn_kwargs) if asyncio.iscoroutinefunction(fn) else fn(**fn_kwargs)
^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", line 256, in sync_wrapper
self._handle_exception(observation, e)
File "/app/.venv/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", line 520, in _handle_exception
raise e
File "/app/.venv/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", line 254, in sync_wrapper
result = func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/src/pipelines/retrieval/retrieval.py", line 334, in construct_retrieval_results
columns_and_tables_needed = orjson.loads(
^^^^^^^^^^^^^
orjson.JSONDecodeError: unexpected character: line 1 column 1 (char 0)
-------------------------------------------------------------------
Oh no an error! Need help with Hamilton?
Join our slack and ask for help! https://join.slack.com/t/hamilton-opensource/shared_invite/zt-2niepkra8-DGKGf_tTYhXuJWBTXtIs4g
-------------------------------------------------------------------
E0319 06:15:50.792 8 wren-ai-service:529] ask pipeline - OTHERS: unexpected character: line 1 column 1 (char 0)
Traceback (most recent call last):
File "/src/web/v1/services/ask.py", line 318, in ask
retrieval_result = await self._pipelines["retrieval"].run(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", line 219, in async_wrapper
self._handle_exception(observation, e)
File "/app/.venv/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", line 520, in _handle_exception
raise e
File "/app/.venv/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", line 217, in async_wrapper
result = await func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/src/pipelines/retrieval/retrieval.py", line 485, in run
return await self._pipe.execute(
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/hamilton/async_driver.py", line 375, in execute
raise e
File "/app/.venv/lib/python3.12/site-packages/hamilton/async_driver.py", line 366, in execute
outputs = await self.raw_execute(_final_vars, overrides, display_graph, inputs=inputs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/hamilton/async_driver.py", line 326, in raw_execute
raise e
File "/app/.venv/lib/python3.12/site-packages/hamilton/async_driver.py", line 321, in raw_execute
results = await await_dict_of_tasks(task_dict)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/hamilton/async_driver.py", line 23, in await_dict_of_tasks
coroutines_gathered = await asyncio.gather(*coroutines)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/hamilton/async_driver.py", line 36, in process_value
return await val
^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/hamilton/async_driver.py", line 122, in new_fn
await fn(**fn_kwargs) if asyncio.iscoroutinefunction(fn) else fn(**fn_kwargs)
^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", line 256, in sync_wrapper
self._handle_exception(observation, e)
File "/app/.venv/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", line 520, in _handle_exception
raise e
File "/app/.venv/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", line 254, in sync_wrapper
result = func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/src/pipelines/retrieval/retrieval.py", line 334, in construct_retrieval_results
columns_and_tables_needed = orjson.loads(
^^^^^^^^^^^^^
orjson.JSONDecodeError: unexpected character: line 1 column 1 (char 0)
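For reference, the failure mode in the traceback is that orjson.loads() rejects any reply that does not begin with a JSON value. The snippet below is a minimal reproduction sketch; the reply string is a hypothetical example of a model answer wrapped in a markdown fence, not the actual output from this run.

```python
# Minimal reproduction sketch of the error above. The reply string is a
# hypothetical example of a fenced model answer; the real output is not logged.
import orjson

hypothetical_llm_reply = '```json\n{"results": []}\n```'

try:
    orjson.loads(hypothetical_llm_reply)
except orjson.JSONDecodeError as exc:
    # orjson rejects the leading backtick, producing an "unexpected character"
    # error like the one shown in the traceback.
    print(exc)
```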
The .env configuration is as follows:
COMPOSE_PROJECT_NAME=wrenai
PLATFORM=linux/amd64
PROJECT_DIR=.
# service port
WREN_ENGINE_PORT=8080
WREN_ENGINE_SQL_PORT=7432
WREN_AI_SERVICE_PORT=5555
WREN_UI_PORT=3000
IBIS_SERVER_PORT=8000
WREN_UI_ENDPOINT=http://wren-ui:${WREN_UI_PORT}
# ai service settings
QDRANT_HOST=qdrant
SHOULD_FORCE_DEPLOY=1
# vendor keys
LLM_OPENAI_API_KEY=
EMBEDDER_OPENAI_API_KEY=
LLM_AZURE_OPENAI_API_KEY=
EMBEDDER_AZURE_OPENAI_API_KEY=
QDRANT_API_KEY=
# version
# CHANGE THIS TO THE LATEST VERSION
WREN_PRODUCT_VERSION=0.16.0
WREN_ENGINE_VERSION=0.14.5
WREN_AI_SERVICE_VERSION=0.16.2
IBIS_SERVER_VERSION=0.14.5
WREN_UI_VERSION=0.21.0
WREN_BOOTSTRAP_VERSION=0.1.5
# user id (uuid v4)
USER_UUID=
# for other services
POSTHOG_API_KEY=phc_nhF32aj4xHXOZb0oqr2cn4Oy9uiWzz6CCP4KZmRq9aE
POSTHOG_HOST=https://app.posthog.com
TELEMETRY_ENABLED=true
# this is for telemetry to know the model, i think ai-service might be able to provide a endpoint to get the information
GENERATION_MODEL=gpt-4o-mini
LANGFUSE_SECRET_KEY=
LANGFUSE_PUBLIC_KEY=
# the port exposes to the host
# OPTIONAL: change the port if you have a conflict
HOST_PORT=3000
AI_SERVICE_FORWARD_PORT=5555
# Wren UI
EXPERIMENTAL_ENGINE_RUST_VERSION=false
LLM_OLLAMA_API_KEY=random
EMBEDDER_OLLAMA_API_KEY=random
EMBEDDER_OLLAMA_URL=http://132.120.139.194:11434
LLM_OLLAMA_URL=http://132.120.132.44:11434
OPENAI_API_KEY=dummy-key
@wxp2023 you could set up Langfuse first by following this doc: https://docs.getwren.ai/oss/ai_service/guide/langfuse_setup
In simple terms, the error message means that your LLM generated malformed JSON output. After you set up Langfuse, you can check the error details yourself.
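As a hedged illustration only (not WrenAI's actual implementation), a defensive parser could strip a markdown fence before handing the reply to orjson and fall back to an empty result instead of raising:

```python
# A defensive-parsing sketch, assuming the model sometimes wraps its JSON in a
# markdown fence. Illustrative only; this is not the project's real code.
import re
from typing import Any

import orjson


def parse_llm_json(text: str) -> Any:
    """Best-effort extraction of a JSON value from an LLM reply."""
    candidate = text.strip()
    # If the reply is fenced (```json ... ```), keep only the fenced body.
    fenced = re.search(r"`{3}(?:json)?\s*(.*?)\s*`{3}", candidate, re.DOTALL)
    if fenced:
        candidate = fenced.group(1)
    try:
        return orjson.loads(candidate)
    except orjson.JSONDecodeError:
        # Let the caller decide how to handle an unparseable reply.
        return {}
```

With a fallback like this, a caller such as construct_retrieval_results could handle an empty result instead of crashing, though getting the model to emit valid JSON (e.g. via prompt or model choice) is still the real fix.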