
Add support for Palm, Claude-2, Llama2, CodeLlama (100+ LLMs) #129

Open · wants to merge 1 commit into main
Conversation

ishaan-jaff

This PR adds support for the above-mentioned LLMs using LiteLLM: https://github.com/BerriAI/litellm/

Example

import os
from litellm import completion

## set ENV variables
os.environ["OPENAI_API_KEY"] = "openai key"
os.environ["COHERE_API_KEY"] = "cohere key"
os.environ["ANTHROPIC_API_KEY"] = "anthropic key"

messages = [{"content": "Hello, how are you?", "role": "user"}]

# openai call
response = completion(model="gpt-3.5-turbo", messages=messages)

# cohere call
response = completion(model="command-nightly", messages=messages)

# anthropic call
response = completion(model="claude-instant-1", messages=messages)


ishaan-jaff (Author)

addressing: #128

ishaan-jaff (Author)

@grumpyp can you take a look at this PR when possible? Thanks!

grumpyp (Owner) commented Sep 9, 2023

Thanks for the contribution. Would you also implement it from the frontend side? Does this implementation download the LLMs to your machine?

If yes, did you have a look at how the other LLMs are currently stored? It would make sense to do it the same way :)

Thx!

ishaan-jaff (Author)

  • Can we address the frontend in a separate PR?
  • This does not download any LLMs to your machine.

grumpyp (Owner) commented Sep 11, 2023

Why should we separate it into another PR?

OK, as far as I understand, it just uses LLMs provided by third-party APIs?

That's fine, as long as it is compatible with our current implementation. Is there a list of all the LLMs that can be used with litellm? Then we could think about how to implement it nicely on the frontend side.

Or at least add some tests to this PR, please.
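
For reference, a minimal sketch of the kind of test that could cover this, assuming pytest; it monkeypatches litellm.completion so it runs without API keys or network access, and answer_question is a hypothetical stand-in for whatever wrapper this repo puts around litellm:

# test_llm_backend.py - hypothetical sketch, not the repo's actual code
import litellm

def answer_question(question: str) -> str:
    # Hypothetical wrapper, defined here only so the sketch is self-contained.
    response = litellm.completion(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": question}],
    )
    return response["choices"][0]["message"]["content"]

def test_answer_question_openai_format(monkeypatch):
    def fake_completion(model, messages, **kwargs):
        # litellm normalizes every provider to this OpenAI-style shape.
        return {"choices": [{"message": {"role": "assistant", "content": "stubbed"}}]}

    # Stub the network call; the test only checks response handling.
    monkeypatch.setattr(litellm, "completion", fake_completion)
    assert answer_question("Hello?") == "stubbed"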

krrishdholakia

@grumpyp

re: provider list,

Yes, here are the docs: https://docs.litellm.ai/docs/providers

via code:

import litellm

# prints the provider identifiers litellm accepts, e.g. "openai", "cohere", "anthropic"
print(litellm.provider_list)

grumpyp (Owner) commented Oct 5, 2023

> Yes, here are the docs: https://docs.litellm.ai/docs/providers

Feel free to test everything.

If it doesn't break, I'll be happy to merge it.
