
Local LLM Functionary-7b-v2.1-GGUF and Extended OpenAI Conversation #174


@jekalmin I am running the Functionary-7b-v2.1-GGUF model via the Python bindings for llama.cpp (llama-cpp-python). Here is an example of my Python script successfully calling the LLM and returning the correct response for get_current_weather:

[screenshot: Python script output showing a successful get_current_weather tool call]

The problem is that when I issue an almost identical request via Home Assistant and your Extended OpenAI Conversation integration, I get the following error:
[screenshot: error message shown in Home Assistant]

Here are the Home Assistant logs:

2024-03-16 01:27:32.831 INFO (MainThread) [custom_components.extended_openai_conversation] Prompt for /data/models/huggingface/models--meetkai--functionary-7b-v2.1-GGUF/snapshots/4386d5a19700bd0ae1582574e3de13a218fb1c8e/functionary-7b-v2.1.q4_0.gguf: [{'role': 'system', 'content': "I want you to act as smart home manager of Home Assistant.\nI will provide information of smart home along with a question, you will truthfully make correction or answer using information provided in one sentence in everyday language.\n\nCurrent Time: 2024-03-16 01:27:32.826442-04:00\n\nAvailable Devices:\n```csv\nentity_id,name,state,aliases\nweather.clusterhome,Local Weather,partlycloudy,\n```\n\nThe current state of devices is provided in available devices.\nUse execute_services function only for requested action, not for current states.\nDo not execute service without user's confirmation.\nDo not restate or appreciate what user says, rather make a quick inquiry."}, {'role': 'user', 'content': 'What is the weather in New York?'}]
2024-03-16 01:27:37.786 INFO (MainThread) [custom_components.extended_openai_conversation] Response {'id': 'chatcmpl-05d7ce5d-c3d2-4fd1-ac71-1af5266ea5d4', 'choices': [{'finish_reason': 'tool_calls', 'index': 0, 'message': {'role': 'assistant', 'function_call': {'arguments': '{}', 'name': ' get_current_weather'}, 'tool_calls': [{'id': 'call_zrdjmFbhRmjp7l9010Jvr4EP', 'function': {'arguments': '{}', 'name': ' get_current_weather'}, 'type': 'function'}]}}], 'created': 1710566857, 'model': '/data/models/huggingface/models--meetkai--functionary-7b-v2.1-GGUF/snapshots/4386d5a19700bd0ae1582574e3de13a218fb1c8e/functionary-7b-v2.1.q4_0.gguf', 'object': 'chat.completion', 'usage': {'completion_tokens': 2, 'prompt_tokens': 349, 'total_tokens': 351}}
2024-03-16 01:27:37.790 ERROR (MainThread) [custom_components.extended_openai_conversation] function ' get_current_weather' does not exist
Traceback (most recent call last):
  File "/config/custom_components/extended_openai_conversation/__init__.py", line 196, in async_process
    query_response = await self.query(user_input, messages, exposed_entities, 0)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/config/custom_components/extended_openai_conversation/__init__.py", line 384, in query
    return await self.execute_tool_calls(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/config/custom_components/extended_openai_conversation/__init__.py", line 477, in execute_tool_calls
    raise FunctionNotFound(function_name)
custom_components.extended_openai_conversation.exceptions.FunctionNotFound: function ' get_current_weather' does not exist

If you inspect the logs above closely, it looks like Extended OpenAI Conversation is not passing the spec I defined for get_current_weather. In other words, I think the tools/functions part of the payload is not being passed to the LLM. What am I doing wrong here? Any tips?
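I also notice that the error is raised for the name ' get_current_weather' with a leading space, exactly as it appears in the logged response. Presumably the component resolves tool calls with an exact name match against the configured specs, along the lines of this sketch (not the component's actual code):

```python
# Illustration only: an exact-match lookup fails when the model emits the
# function name with a leading space, as in the response logged above.
class FunctionNotFound(Exception):
    pass

functions = {"get_current_weather": lambda **kwargs: "..."}  # configured specs

name_from_model = " get_current_weather"  # copied verbatim from the log
if name_from_model not in functions:
    raise FunctionNotFound(f"function '{name_from_model}' does not exist")
```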

Btw, here are the options I'm passing into your integration. Notice that I am in fact defining the get_current_weather spec:
[screenshot: Extended OpenAI Conversation options showing the get_current_weather function spec]
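For reference, the spec in the options looks roughly like this (a sketch in the integration's YAML functions format; the template body is illustrative, with weather.clusterhome taken from the device list in the logs above):

```yaml
- spec:
    name: get_current_weather
    description: Get the current weather for a location
    parameters:
      type: object
      properties:
        location:
          type: string
          description: City and state, e.g. New York, NY
      required:
        - location
  function:
    type: template
    value_template: "{{ states('weather.clusterhome') }}"
```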
