OpenAI o3-pro model unusable in Zed due to Responses API #33111
jvmncs started this conversation in LLMs and Zed Agent
Replies: 1 comment
-
FYI, I got o3-pro to work via OpenRouter, but only in minimal mode... tool use breaks it, although I might've configured it wrong, since none of the o1, o3, etc. models work for me via OpenRouter.
-
The new o3-pro model from OpenAI is only available in their newer, stateful Responses API, not in the Chat Completions API that Zed uses for OpenAI models. This feature request is to expand the scope of Zed's OpenAI client to support the Responses API so that o3-pro can be used in Zed.
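
For reference, the difference between the two endpoints with the official `openai` Python SDK looks roughly like the sketch below. This is a minimal illustration, not Zed code: the prompt text is made up, and the Chat Completions call is shown with a model that endpoint does serve, since o3-pro is only exposed through the Responses API.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Chat Completions API: the endpoint Zed's OpenAI client uses today.
# o3-pro is not served here, so requesting it fails with a model error.
chat = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Say hello."}],
)
print(chat.choices[0].message.content)

# Responses API: the newer, stateful endpoint where o3-pro is available.
resp = client.responses.create(
    model="o3-pro",
    input="Say hello.",
)
print(resp.output_text)
```

The Responses API also supports chaining turns server-side (e.g. via a previous response's ID) rather than resending the full message history each call, which is the "stateful" part the feature request refers to.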