OpenRouter does not work with Cody #6109
Comments
Hey @githubdebugger Since providing your own models via Ollama or third-party services is an experimental feature, it might behave unexpectedly. Sorry for the inconvenience.
Will this be available in the next Cody pre-release?
Any updates on this? Will this be available in the official release?
@githubdebugger So far I can't share an update. I'm also not aware of any ETA. Sorry.
Please raise the severity; this is an issue that needs attention!
Version
v1.41.1731027960
Describe the bug
OpenRouter does not work with Cody
Added this config in settings.json:
{
"provider": "groq", // keep groq as provider
"model": "qwen/qwen-2.5-coder-32b-instruct",
"inputTokens": 128000,
"outputTokens": 8192,
"apiKey": "<api_key>",
"apiEndpoint": "https://openrouter.ai/api/v1/chat/completions"
},
I'm using groq as the provider with the OpenRouter endpoint, since groq is the provider that must be used for OpenAI-compatible APIs.
And this is the response:
Request Failed: HTTP 400 Bad Request: {"error":{"message":"qwen-2.5-coder-32b-instruct is not a valid model ID","code":400}}
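To rule out an invalid model ID, the same model can be requested directly from OpenRouter outside Cody. A minimal sketch (assumes Node 18+ in an ES module, where fetch and top-level await are available; OPENROUTER_API_KEY is a placeholder for your key):

// Standalone check, independent of Cody: send the full model ID
// (including the "qwen/" org prefix) straight to OpenRouter.
const res = await fetch("https://openrouter.ai/api/v1/chat/completions", {
  method: "POST",
  headers: {
    Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}`,
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    model: "qwen/qwen-2.5-coder-32b-instruct", // full ID with org prefix
    messages: [{ role: "user", content: "Say hello" }],
  }),
});
console.log(res.status, JSON.stringify(await res.json()));

If this returns 200, the model ID itself is valid on OpenRouter, and the 400 only appears when the prefix is stripped somewhere between the config and the request.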
It looks like Cody is sending the model as qwen-2.5-coder-32b-instruct instead of qwen/qwen-2.5-coder-32b-instruct, i.e. the qwen/ org prefix is being stripped.
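For illustration only (a purely hypothetical reconstruction; I haven't checked Cody's source): any handling that splits the configured model string on "/" and forwards only the last segment would produce exactly this error:

// Hypothetical illustration, not Cody's actual code.
const configured = "qwen/qwen-2.5-coder-32b-instruct";
const sent = configured.split("/").pop(); // "qwen-2.5-coder-32b-instruct" -- the org prefix is lost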
Expected behavior
Cody should send the full model ID, qwen/qwen-2.5-coder-32b-instruct, and receive a successful response from the endpoint.
Additional context
No response