Knowledge Agent Issue #234

@InnovationMaitools

Description

2025-07-21 11:58:52.328 INFO {task_manager} [_listen_transcriber] Received transcript, sending for further processing
2025-07-21 11:58:52.329 INFO {task_manager} [_handle_transcriber_output] Running llm Tasks
2025-07-21 11:58:52.856 INFO {_client} [_send_single_request] HTTP Request: POST https://api.openai.com/v1/embeddings "HTTP/1.1 200 OK"
2025-07-21 11:58:52.868 INFO {base} [query] query_type :, vector
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/bolna/agent_manager/task_manager.py", line 1389, in _run_llm_task
    await self._process_conversation_task(message, sequence, meta_info)
  File "/usr/local/lib/python3.10/site-packages/bolna/agent_manager/task_manager.py", line 1326, in _process_conversation_task
    await self.__do_llm_generation(messages, meta_info, next_step, should_bypass_synth)
  File "/usr/local/lib/python3.10/site-packages/bolna/agent_manager/task_manager.py", line 1222, in __do_llm_generation
    async for llm_message in self.tools['llm_agent'].generate(messages, synthesize=synthesize, meta_info=meta_info):
  File "/usr/local/lib/python3.10/site-packages/bolna/agent_types/knowledgebase_agent.py", line 116, in generate
    response = await self.query_engine.aquery(message.content)
  File "/usr/local/lib/python3.10/site-packages/llama_index/core/instrumentation/dispatcher.py", line 290, in async_wrapper
    result = await func(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/llama_index/core/base/base_query_engine.py", line 64, in aquery
    query_result = await self._aquery(str_or_query_bundle)
  File "/usr/local/lib/python3.10/site-packages/llama_index/core/instrumentation/dispatcher.py", line 290, in async_wrapper
    result = await func(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/llama_index/core/query_engine/retriever_query_engine.py", line 204, in _aquery
    nodes = await self.aretrieve(query_bundle)
  File "/usr/local/lib/python3.10/site-packages/llama_index/core/query_engine/retriever_query_engine.py", line 148, in aretrieve
    nodes = await self._retriever.aretrieve(query_bundle)
  File "/usr/local/lib/python3.10/site-packages/llama_index/core/instrumentation/dispatcher.py", line 290, in async_wrapper
    result = await func(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/llama_index/core/base/base_retriever.py", line 274, in aretrieve
    nodes = await self._aretrieve(query_bundle=query_bundle)
  File "/usr/local/lib/python3.10/site-packages/llama_index/core/instrumentation/dispatcher.py", line 290, in async_wrapper
    result = await func(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/llama_index/core/indices/vector_store/retrievers/retriever.py", line 112, in _aretrieve
    return await self._aget_nodes_with_embeddings(
  File "/usr/local/lib/python3.10/site-packages/llama_index/core/indices/vector_store/retrievers/retriever.py", line 184, in _aget_nodes_with_embeddings
    query_result = await self._vector_store.aquery(query, **self._kwargs)
  File "/usr/local/lib/python3.10/site-packages/llama_index/core/vector_stores/types.py", line 420, in aquery
    return self.query(query, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/llama_index/vector_stores/lancedb/base.py", line 484, in query
    self._table.search(
AttributeError: 'NoneType' object has no attribute 'search'
2025-07-21 11:58:52.871 ERROR {task_manager} [_run_llm_task] Something went wrong in llm: 'NoneType' object has no attribute 'search'
2025-07-21 11:58:53.309 INFO {deepgram_transcriber} [receiver] Value of is_transcript_sent_for_processing in utterance end - True
2025-07-21 11:58:53.498 INFO {deepgram_transcriber} [receiver] Received SpeechStarted event from deepgram
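
The traceback ends in `llama_index/vector_stores/lancedb/base.py`, where `LanceDBVectorStore.query()` calls `self._table.search(...)` while `self._table` is still `None`. In other words, the knowledgebase agent's query engine was built on top of a LanceDB store whose table was never created or never opened before the first query arrived.

A minimal diagnostic sketch, assuming the store is persisted to a local LanceDB directory (the URI and table name below are placeholders, not the agent's actual configuration):

```python
# Diagnostic sketch: check whether the LanceDB table the agent queries exists.
# LANCEDB_URI and TABLE_NAME are assumptions; substitute whatever is passed to
# LanceDBVectorStore when the knowledgebase agent is set up.
import lancedb

LANCEDB_URI = "/tmp/lancedb"   # placeholder
TABLE_NAME = "knowledgebase"   # placeholder

db = lancedb.connect(LANCEDB_URI)
tables = db.table_names()
print("existing tables:", tables)

if TABLE_NAME not in tables:
    # If the table is missing, the vector store can be left with _table as None,
    # which matches: AttributeError: 'NoneType' object has no attribute 'search'
    print(f"table '{TABLE_NAME}' not found; the index was never built or persisted here")
else:
    table = db.open_table(TABLE_NAME)
    print("row count:", table.count_rows())
```

For comparison, a hedged sketch of how a LanceDB-backed query engine is typically wired in llama-index so that the table exists and is populated before `aquery()` runs (paths, names, and the sample document are illustrative only):

```python
# Illustrative wiring, not Bolna's actual setup: building the index through the
# vector store creates and fills the LanceDB table, so later queries have a
# table to search against.
from llama_index.core import Document, StorageContext, VectorStoreIndex
from llama_index.vector_stores.lancedb import LanceDBVectorStore

vector_store = LanceDBVectorStore(uri="/tmp/lancedb", table_name="knowledgebase")  # placeholders
storage_context = StorageContext.from_defaults(vector_store=vector_store)

# from_documents() embeds and inserts the nodes, creating the table on first write.
index = VectorStoreIndex.from_documents(
    [Document(text="example knowledge base content")],
    storage_context=storage_context,
)
query_engine = index.as_query_engine()
print(query_engine.query("example question"))
```

If the agent instead attaches to an existing store (for example via `VectorStoreIndex.from_vector_store`), this failure suggests the expected table was never created at that URI, so logging the resolved URI and table name when `knowledgebase_agent.py` builds its query engine would make the misconfiguration visible before the first call.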

Labels: bug
