
[feature] Is there a plan for an OpenAI API compatible reverse proxy? #67

Open
vibl opened this issue Oct 6, 2023 · 6 comments
Labels: enhancement (New feature or request)
vibl commented Oct 6, 2023

Is there a plan for an OpenAI API compatible reverse proxy?

Most chat UI apps and dataflow frameworks have a connector to the OpenAI API, so a reverse proxy mimicking that API would be a godsend.
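To illustrate why this works: OpenAI connectors speak a fixed JSON wire format, so an app only needs its base URL repointed at the proxy. A minimal standard-library sketch of the request a connector would send (the `localhost:8080` address and the bearer token are hypothetical placeholders for such a proxy):

```python
# Sketch: any OpenAI connector only needs its base URL swapped to target a proxy.
import json
import urllib.request

# Hypothetical local proxy; a real app would otherwise use https://api.openai.com/v1
BASE_URL = "http://localhost:8080/v1"

payload = {
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Hello!"}],
}
req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json",
             "Authorization": "Bearer sk-anything"},  # proxy may ignore the key
    method="POST",
)
# urllib.request.urlopen(req) would then return an OpenAI-style
# chat.completion JSON object produced by the proxy.
```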


vibl commented Oct 6, 2023

We could leverage https://github.com/xtekky/gpt4free for that.

@snowby666 snowby666 added the enhancement New feature or request label Oct 10, 2023
@snowby666 snowby666 pinned this issue Oct 10, 2023

Aws-killer commented Oct 17, 2023

I think you can use https://github.com/BerriAI/litellm/tree/main. It shouldn't be too hard... (You can call the repo PoLite 🤣)


Mchar7 commented Dec 19, 2023

I made a bare-bones OpenAI compatible API using your library and have been using it locally for a few hours. It currently only mimics the /v1/chat/completions (no text streaming yet) and /v1/models endpoints, but I plan on implementing Messages, Threads, and Assistants as Poe messages, chats, and bots respectively.

Would you accept a pull request adding this functionality to the library? (Or a variant, à la poe-api-wrapper[openai-proxy] with an additional Flask dependency) It could add something like poe-openai-server=poe_api_wrapper.openai-server:main to entry_points in setup.py. Or, would you rather I publish this as a separate project (with your library acknowledged, of course)?
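For reference, a sketch of the response shaping such a `/v1/chat/completions` handler would perform. Field names follow OpenAI's `chat.completion` schema; the zeroed `usage` counts are an assumption (Poe does not expose token counts), and a real server would obtain `reply` from poe-api-wrapper and serve this dict as JSON via Flask:

```python
# Shape a Poe bot reply into an OpenAI-style chat.completion object.
import time
import uuid

def to_chat_completion(model: str, reply: str) -> dict:
    """Build the JSON body a /v1/chat/completions endpoint would return."""
    return {
        "id": f"chatcmpl-{uuid.uuid4().hex[:12]}",
        "object": "chat.completion",
        "created": int(time.time()),
        "model": model,
        "choices": [{
            "index": 0,
            "message": {"role": "assistant", "content": reply},
            "finish_reason": "stop",
        }],
        # Assumption: Poe reports no token counts, so usage is zeroed.
        "usage": {"prompt_tokens": 0, "completion_tokens": 0,
                  "total_tokens": 0},
    }
```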

snowby666 (Owner) commented
> I made a bare-bones OpenAI compatible API using your library and have been using it locally for a few hours. It currently only mimics the /v1/chat/completions (no text streaming yet) and /v1/models endpoints, but I plan on implementing Messages, Threads, and Assistants as Poe messages, chats, and bots respectively.
>
> Would you accept a pull request adding this functionality to the library? (Or a variant, à la poe-api-wrapper[openai-proxy] with an additional Flask dependency) It could add something like poe-openai-server=poe_api_wrapper.openai-server:main to entry_points in setup.py. Or, would you rather I publish this as a separate project (with your library acknowledged, of course)?

Sounds cool! Integrating your OpenAI wrapper into the library is a good idea. I'd be happy to take a look at a pull request with that functionality added.
Love to see what you come up with.

ishaan-jaff commented
Hi @Aws-killer, do you use LiteLLM Proxy? Can we hop on a call to learn how we can make litellm better for you?

Link to my calendar for your convenience

comeback01 commented

Is there a release date?


6 participants