Adding Groq Support #1237
With @aj47's help ❤️ this is working.
This is NOT working.
Is your feature request related to a problem? Please describe.
Hi there :)
I have not seen any other issue or PR on this.
I prefer not to use the OpenAI API; I don't trust them with my data.
I think if we can get Mixtral or Llama 3 70B (or a future 400B) working with OpenInterpreter, it would be a much-needed speed improvement and cost reduction.
Describe the solution you'd like
I would like to be able to use Groq's API instead of OpenAI's.
They offer best-in-class inference speeds, they are not yet affiliated with any large corporation (that I am aware of), and their API is free, at least for now.
Describe alternatives you've considered
Honestly, since Groq is not mentioned anywhere (not in the README, the docs, issues, or PRs), I assumed it wasn't supported, but reading the litellm docs I can see that they already support Groq.
Additional context
I have already played with the official Groq Python API quite a bit, using it in other advanced pipelines, and it seems it will fit well with OI. But when using

`--model groq/mixtral-8x7b-32768`

I am getting errors.
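For context, litellm routes any model prefixed with `groq/` to Groq's OpenAI-compatible endpoint and reads the key from the `GROQ_API_KEY` environment variable. A minimal sketch of that path (the network call is left commented out since it needs a live key; the prompt is made up):

```python
import os

# litellm reads the Groq key from this environment variable.
os.environ.setdefault("GROQ_API_KEY", "gsk_...")  # placeholder, not a real key

request = {
    "model": "groq/mixtral-8x7b-32768",  # the "groq/" prefix selects the provider
    "messages": [{"role": "user", "content": "Print hello world in Python."}],
}

# With a real key installed, the call would be:
#   import litellm
#   response = litellm.completion(**request)
#   print(response.choices[0].message.content)

provider = request["model"].split("/", 1)[0]
print(provider)  # groq
```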
I have created a new branch, `groq`, to try to fix this, bypassing litellm and using the official API directly. I've managed to hook it into the existing OI flow well, with no errors :)
But no code is getting executed either...
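For reference, a sketch of what the direct path through the official SDK looks like (assuming `pip install groq` and a `GROQ_API_KEY` environment variable; the actual call is commented out because it requires a live key, and the prompt is made up):

```python
# Messages in the OpenAI-style chat format that Groq's SDK accepts.
messages = [
    {"role": "system", "content": "You are a coding assistant."},
    {"role": "user", "content": "Write Python that prints 2 + 2."},
]

# With a live key, the direct Groq call would be:
#   from groq import Groq
#   client = Groq()  # picks up GROQ_API_KEY from the environment
#   resp = client.chat.completions.create(
#       model="mixtral-8x7b-32768",  # note: no "groq/" prefix when bypassing litellm
#       messages=messages,
#   )
#   print(resp.choices[0].message.content)

print(len(messages))  # 2
```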
It's doing everything right, but it seems to hallucinate the result instead of actually running the code it wrote.
I understand that it has to output 'execute' and 'code', but I'm not sure in which format.
Can anyone tell me what format OI is expecting? (I currently don't have OpenAI API access to test against.)
An example of a working output JSON would be great.
It seems it's just a matter of making Mixtral and other models conform to the output format OI expects.
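To make the question concrete, here is a hypothetical example of the kind of output JSON being asked about, shaped like an OpenAI function-calling message with an `execute` call carrying `language` and `code` arguments. The exact field names OI expects are an assumption here and should be confirmed against its source:

```python
import json

# Hypothetical assistant message in OpenAI function-calling format;
# the "execute"/"language"/"code" names are assumed, not confirmed.
model_message = {
    "role": "assistant",
    "content": None,
    "function_call": {
        "name": "execute",
        # "arguments" is a JSON-encoded string, not a nested object.
        "arguments": json.dumps({"language": "python", "code": "print(2 + 2)"}),
    },
}

# A consumer would decode the arguments before running the code.
args = json.loads(model_message["function_call"]["arguments"])
print(args["language"])  # python
print(args["code"])      # print(2 + 2)
```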
(I'll do more research; maybe someone solves this in another issue.)
Really hoping to get this working.
Thanks a lot, and all the best!