[Feature]: Add Local model provider #311
Comments
@khursheed33 this is supported with the Gateway now! You can explore the docs for this here: https://portkey.ai/docs/welcome/integration-guides/byollm Please let me know if this is helpful!

@khursheed33 I'll close this issue now, please feel free to reopen if needed!
What Would You Like to See with the Gateway?
Support for local models: we should be able to use our own locally hosted models by simply passing a baseUrl in the request body.
Context for your Request
Support for local models: we should be able to use our own locally hosted models by simply passing a baseUrl in the request body.
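A minimal sketch of what the requested usage might look like: a chat-completion payload that points the gateway at a locally hosted, OpenAI-compatible server (e.g. Ollama or llama.cpp) via a `baseUrl` field in the body. The field names and the local endpoint here are illustrative assumptions, not the Gateway's actual schema — see the BYOLLM docs linked above for the supported integration path.

```python
import json

def build_local_model_request(base_url: str, model: str, prompt: str) -> dict:
    """Assemble a hypothetical request body that carries a baseUrl,
    as proposed in this issue. Field names are assumptions for
    illustration, not the Gateway's real schema."""
    return {
        "baseUrl": base_url,  # assumed: a local OpenAI-compatible endpoint
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# Example: targeting a local Ollama server (assumed default port)
payload = build_local_model_request(
    "http://localhost:11434/v1",
    "llama3",
    "Hello from a local model!",
)
print(json.dumps(payload, indent=2))
```

In this shape, switching between a hosted provider and a local model would only require changing `baseUrl`, which is the ergonomics the request is asking for.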
Your Twitter/LinkedIn
https://linkedin.com/in/khursheed33