
Litellm/01 is unable to connect to non-openAI providers. #272

Open
mitstastic opened this issue May 13, 2024 · 12 comments

Comments

@mitstastic

What causes the issue:
Run 01 specifying any non-OpenAI server host and API key.

Expected:
To be able to connect to other services such as Groq, Anthropic, OpenRouter, etc., as they seem to work with the base Open Interpreter.

Screenshots:
[Screenshot 2024-05-13 at 3 58 25 AM]
Using:

  • macOS Ventura 13.6.5
  • macOS Sonoma 14.4.1
  • Windows 10
  • Python 3.9-3.11.8

Feedback
After many attempts with different settings, it seems that either 01 is not passing the right arguments to litellm, or litellm isn't yet correctly configured for other providers in 01.

@mitstastic
Author

mitstastic commented May 13, 2024

Update: The error above was probably caused by erroneously passing a URL with --server-host instead of --server-url. However, the connection still doesn't open with the latter. See the screenshot below.

[Screenshot 2024-05-13 at 6 09 33 PM]

@rwmjhb

rwmjhb commented May 14, 2024

This is the question I wanted to ask as well. It turns out the command line is written like this. Do I need to install and start the litellm service first to get a local connection endpoint?

@Merlinvt

Merlinvt commented May 14, 2024

Since OpenInterpreter uses litellm, I think you need to specify this differently. Here is what I think would work: `poetry run 01 --model "groq/gemma-7b-it" --tts-service piper --stt-service local-whisper`.

Litellm already pulls all the data automatically if you specify the provider in the model. Or at least it should do that.

Here are some instructions on how to get it to work with open router: https://discordapp.com/channels/1146610656779440188/1194880263122075688/1240334434352365569
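The "provider in the model name" routing described above can be sketched like this. Note this is illustrative only; `split_provider` is an invented helper, not actual litellm or 01 code:

```python
# Illustrative sketch of litellm-style "provider/model" routing.
# split_provider is an invented helper, not code from litellm or 01.

def split_provider(model: str) -> tuple[str, str]:
    """Split a "provider/model" string into provider and model parts.

    Only the first slash separates provider from model, so multi-segment
    names like OpenRouter's "openrouter/meta-llama/llama-3-70b" survive.
    A bare name with no slash has no provider and falls back to a default.
    """
    provider, _, name = model.partition("/")
    return (provider, name) if name else ("", model)

print(split_provider("groq/gemma-7b-it"))
print(split_provider("openrouter/meta-llama/llama-3-70b"))
```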

@mitstastic
Author

mitstastic commented May 15, 2024

Well, as you know, in the Discord community some people seemed to suggest that 01 automatically prepends "openai/" to the model name specified in the arguments. So, for instance, you might end up with "openai/groq/gemma-7b-it". Is that what's causing the issue?

> Litellm already pulls all the data automatically if you specify the provider in the model. Or at least it should do that.

If it does, why the need to specify all the details when people use Open Interpreter directly?
And in my experience, when I leave out the server arguments it seems to default to OpenAI and complain that no OpenAI key is set. So I think something in 01's litellm integration is probably interfering or not yet fully configured to support other providers, as it has only been confirmed working with GPT.
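If that prefixing theory is right, the fix would be to prepend "openai/" only when the model name carries no provider prefix. A hypothetical sketch of that behavior (not actual 01 or litellm code; the provider list here is an assumption):

```python
# Hypothetical sketch of the suspected fix: only default to OpenAI when
# the model name has no provider prefix. Not actual 01 or litellm code;
# KNOWN_PROVIDERS is an invented, abbreviated list.

KNOWN_PROVIDERS = {"openai", "groq", "anthropic", "openrouter"}

def normalize_model(model: str) -> str:
    """Prepend "openai/" only for bare model names."""
    if model.split("/", 1)[0] in KNOWN_PROVIDERS:
        return model  # already routed, e.g. "groq/gemma-7b-it"
    return "openai/" + model  # bare name such as "gpt-4"
```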

@rwmjhb

rwmjhb commented May 16, 2024

Do the maintainers care about this? No developer has responded to these questions in days.

@Merlinvt

Merlinvt commented May 16, 2024

If you want to get 01 to work with OpenRouter (and others?), you can try this:

[Screenshot_from_2024-05-15_18-05-36.png]

[Screenshot_from_2024-05-15_18-05-49.png]

It's still super unintuitive and I think it should be made more intuitive. But you can make it work.

The OpenAI key is for Whisper and TTS. If you use a local model, you can leave it out.

I also forgot to mention the "poetry install" before "poetry run".

A different model name would be "openrouter/meta-llama/llama-3-70b".

@rwmjhb

rwmjhb commented May 18, 2024

What if I have neither an OpenAI key nor a local model? How can I use Whisper and TTS? Can I use an OpenRouter API key for all functions?

@Merlinvt

Merlinvt commented May 18, 2024

OpenRouter does not have Whisper. There is a rewrite on the way that implements more options for TTS and STT: https://github.com/KillianLucas/01-rewrite
I don't think they will implement more options in this repo, so without OpenAI or the local models you might need to wait until the rewrite is done. But I could be wrong. You can use Open Interpreter until then.

@aj47

aj47 commented May 19, 2024

This is how I ran 01 with Groq and local TTS/STT: change i.py as per the following diff, then run with:
`poetry run 01 --stt-service local-whisper --tts-service piper`

```diff
diff --git a/software/source/server/i.py b/software/source/server/i.py
index bc792fd..f7a7454 100644
--- a/software/source/server/i.py
+++ b/software/source/server/i.py
@@ -185,10 +185,14 @@ def configure_interpreter(interpreter: OpenInterpreter):
     ### SYSTEM MESSAGE
     interpreter.system_message = system_message
 
-    interpreter.llm.supports_vision = True
+    interpreter.llm.supports_vision = False
     interpreter.shrink_images = True  # Faster but less accurate
 
-    interpreter.llm.model = "gpt-4"
+    # RUN WITH THIS COMMAND FOR LOCAL TTS AND STT
+    # `poetry run 01 --stt-service local-whisper --tts-service piper`
+    interpreter.llm.model = "llama3-70b-8192"
+    interpreter.llm.api_base = "https://api.groq.com/openai/v1/"
+    interpreter.llm.api_key = "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" # YOUR API HERE
 
     interpreter.llm.supports_functions = False
     interpreter.llm.context_window = 110000
```
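One caveat on the diff above: it hard-codes the API key in i.py. A safer variant of the same edit reads the key from the environment instead (GROQ_API_KEY is an assumed variable name here, not one 01 requires):

```python
import os

def groq_api_key() -> str:
    """Read the Groq key from the environment instead of hard-coding it.

    GROQ_API_KEY is an assumed variable name, not mandated by 01.
    """
    key = os.environ.get("GROQ_API_KEY", "")
    if not key:
        raise RuntimeError("Set GROQ_API_KEY before running 01")
    return key

# In i.py, the hard-coded line would then become:
# interpreter.llm.api_key = groq_api_key()
```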

@achoozachooz

Can you tell us which lines we need to change?

@Merlinvt

@aj47 just making sure, that the api key is fake or revoked ;)

@aj47

aj47 commented May 20, 2024

> @aj47 just making sure, that the api key is fake or revoked ;)

Yep, all good, revoked before posting.


5 participants