Is it possible to use Gemini or another llm? #362
Replies: 1 comment
Hello @FabrizioDG, thanks for the suggestion. I've created an issue to track this going forward: #402. Feel free to pitch in with your suggestions in the issue thread. Thanks!
Dear developers,
First of all, I want to congratulate you on this interesting and useful project.
I am evaluating whether I can customize Cognita for a project I want to develop, but for this project I strictly need to use the Gemini LLM. I took a look at the code structure and have some ideas about where I would need to make changes to support Gemini, but at the moment it is hard for me to tell whether there is something under the hood that won't work properly with an LLM that doesn't follow the OpenAI API format. I saw that models_config.yaml has a few options with local LLMs and OpenAI, and I noticed they all have "api_format: openai". I couldn't really understand what this api_format is used for, so I can't tell whether an LLM with a different API format would fail to work with Cognita.
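For reference, here is my understanding of what "api_format: openai" implies, i.e. requests and responses shaped like the OpenAI chat-completions payload. This is a sketch based on the public OpenAI API shape, not taken from the Cognita source, and the helper names are made up for illustration:

```python
# Sketch of the OpenAI chat-completions request/response shape that
# "api_format: openai" presumably refers to. Helper names are hypothetical.

def build_openai_chat_request(model: str, prompt: str) -> dict:
    """Build a request body in the OpenAI chat-completions format."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.1,
    }


def extract_answer(openai_style_response: dict) -> str:
    """Pull the assistant text out of an OpenAI-format response body."""
    return openai_style_response["choices"][0]["message"]["content"]


# Hard-coded example response in the OpenAI format (no network call):
mock_response = {
    "choices": [{"message": {"role": "assistant", "content": "Hello!"}}],
}

request = build_openai_chat_request("gemini-1.5-pro", "Say hello")
print(request["messages"][0]["content"])  # -> Say hello
print(extract_answer(mock_response))      # -> Hello!
```

So my worry is whether any LLM plugged into Cognita must speak exactly this shape, or whether the model gateway can translate.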
In summary: my idea was to add a new gemini provider in models_config, add a gemini_api_key, and modify the model gateway to use langchain-google-genai, but I am not sure whether this would work or whether something under the hood would conflict. How difficult would this customization be?
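For concreteness, something like the following is what I had in mind for models_config.yaml. The provider name, field names, and the environment-variable key are guesses modeled on the existing openai/local entries, not a tested configuration:

```yaml
# Hypothetical gemini entry -- field names guessed from the existing
# entries in models_config.yaml, not verified against the Cognita schema.
model_providers:
  - provider_name: gemini
    api_format: openai        # or a new format handled in the model gateway
    llm_model_ids:
      - "gemini-1.5-pro"
    api_key_env_var: GEMINI_API_KEY
```

The open question is whether the api_format field can stay "openai" (with the gateway translating via langchain-google-genai) or whether a new format would need to be wired through.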
Thank you for your help!