
custom_llm_provider = "oobabooga" response failing with Attribute error in utils.py - handle_openai_chat_completion_chunk() #1180

Open
Domino9752 opened this issue Apr 6, 2024 · 1 comment

Comments

@Domino9752

Describe the bug

Using the oobabooga API to access my local model fails to communicate successfully. I can send prompts to the model via the oobabooga API, and the response comes back to oobabooga.py (stream=true via **optional_params, even when I try to set it to false). The data from the LLM is then sent back out via the line "return response.iter_lines()".

The error occurs in utils.py - handle_openai_chat_completion_chunk(), which complains that the byte string (str_line) has no attribute delta. OK, I know what that means, so I added these classes to utils.py:

class MyDelta:
    def __init__(self, content, role, function_call=None, tool_calls=None):
        self.content = content
        self.role = role
        self.function_call = function_call
        self.tool_calls = tool_calls

class MyChoice:
    def __init__(self, delta, finish_reason, index):
        self.delta = MyDelta(**delta) if isinstance(delta, dict) else delta
        self.finish_reason = finish_reason
        self.index = index

class MyDictString:
    def __init__(self, choices, created, id, model, object):
        self.choices = [MyChoice(**choice) for choice in choices]
        self.created = created
        self.id = id
        self.model = model
        self.object = object
        self.system_fingerprint = "oobabooga"

and a few lines to use them:

        org_line = chunk
        str_line = org_line.decode("utf-8")  # bytes to string
        if str_line.startswith("data:"):
            str_dict = json.loads(str_line[5:])  # strip SSE prefix, then string to dict
        else:
            str_dict = json.loads(str_line)  # string to dict
        str_line = MyDictString(
            str_dict["choices"],
            str_dict["created"],
            str_dict["id"],
            str_dict["model"],
            str_dict["object"],
        )

and the errors "go away", but... yeah, this isn't right.
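For what it's worth, a tidier version of the same workaround (just a sketch, not LiteLLM's actual code) can skip the hand-rolled classes entirely by passing an `object_hook` to `json.loads`, so every nested dict becomes an object with attribute access and `.choices[0].delta.content` resolves the same way it does on the OpenAI response objects:

```python
import json
from types import SimpleNamespace

def parse_sse_chunk(chunk: bytes) -> SimpleNamespace:
    """Decode one server-sent-events line into an attribute-accessible object.

    Assumes the chunk is a JSON object, optionally prefixed with "data:".
    """
    text = chunk.decode("utf-8").strip()
    if text.startswith("data:"):
        text = text[len("data:"):].strip()
    # object_hook converts every nested dict into a SimpleNamespace,
    # so .choices[0].delta.content works like the OpenAI chunk objects.
    return json.loads(text, object_hook=lambda d: SimpleNamespace(**d))

# A chunk in the shape the handler expects (payload values are hypothetical):
raw = (b'data: {"id": "x", "object": "chat.completion.chunk", "created": 0, '
       b'"model": "m", "choices": [{"index": 0, "delta": {"role": "assistant", '
       b'"content": "Hello"}, "finish_reason": null}]}')
parsed = parse_sse_chunk(raw)
print(parsed.choices[0].delta.content)  # → Hello
```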

I have reinstalled, but no joy. Any idea how I got into this mess? I assume "things have changed"...

Reproduce

I have installed open-interpreter twice on Windows 10; I am familiar with the oobabooga API and its documentation.
python -m pip install open-interpreter (I have also tried python -m pip install litellm --upgrade).
NOTE: So I am aware of how fast code changes are being made.

I am running open-interpreter from python, and have experimented with the parameters. I am currently using:

from interpreter import interpreter

interpreter.offline = True
interpreter.verbose = False
interpreter.llm.model = "oobabooga/TheBloke_CodeLlama-7B-Python-GPTQ_gptq-4bit-32g-actorder_True"
interpreter.llm.api_base = "http://127.0.0.1:5000"
interpreter.llm.api_key = None
interpreter.llm.api_version = '2.0.2'
interpreter.llm.temperature = 0
interpreter.llm.context_window = 4096
interpreter.llm.max_tokens = 1024

interpreter.chat()

Expected behavior

a dialog that would allow me to create and run simple Python code locally on my computer

Screenshots

No response

Open Interpreter version

Version: 0.2.4

Python version

Python 3.11.4

Operating System name and version

Windows 10

Additional context

No response

@Domino9752
Author

After getting this to work, it is clear to me that this is an issue with LiteLLM, not open-interpreter. Basically, oobabooga modified their API a couple of weeks ago, and LiteLLM's last related update was 4 months ago. I need to post a PR over there.
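For anyone hitting the same wall before a fix lands upstream, one possible interim workaround (untested here, and assuming the text-generation-webui server was started with its OpenAI-compatible API extension enabled) is to bypass the stale "oobabooga" provider and point open-interpreter at the server as a generic OpenAI-style endpoint. The model name below is purely illustrative:

```python
from interpreter import interpreter

# Hypothetical workaround: treat the local server as an OpenAI-compatible
# endpoint instead of using custom_llm_provider = "oobabooga".
interpreter.offline = True
interpreter.llm.model = "openai/local-model"           # name is illustrative
interpreter.llm.api_base = "http://127.0.0.1:5000/v1"  # note the /v1 suffix
interpreter.llm.api_key = "dummy"                      # local server ignores it

interpreter.chat()
```

Whether this works depends on how closely the server's streaming chunks match the OpenAI wire format, which is exactly what changed in the API update.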
