Add OpenRouter as an LLM service provider #6484


Open
wants to merge 7 commits into base: main

Conversation

@hyc1230 hyc1230 commented May 11, 2025

💻 变更类型 | Change Type

  • feat
  • fix
  • refactor
  • perf
  • style
  • test
  • docs
  • ci
  • chore
  • build

🔀 变更说明 | Description of Change

  • Add OpenRouter as an LLM provider.
  • Models served by OpenRouter must be configured via the CUSTOM_MODELS environment variable (all except openrouter/auto, since OpenRouter's model list is too long to register by default).
  • Corresponding Chinese/English documentation updates.
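Concretely, the registration described above can look like the following .env sketch (the key value is a placeholder; qwen/qwen3-32b:free is the example model used in this PR's README changes, and the URL is OpenRouter's documented API base):

```shell
# Hypothetical .env fragment for the OpenRouter integration.
OPENROUTER_API_KEY=sk-or-placeholder
OPENROUTER_URL=https://openrouter.ai/api/v1
# Register OpenRouter models explicitly; openrouter/auto is included by default.
CUSTOM_MODELS=+qwen/qwen3-32b:free@OpenRouter
```

The `+model@Provider` syntax follows the existing CUSTOM_MODELS convention documented in the README.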

📝 补充信息 | Additional Information

Differences between this PR and the existing OpenRouter PR #6324:

  • Supports displaying the reasoning content of models served by OpenRouter.
  • More accurate model icon recognition.

Summary by CodeRabbit

  • New Features

    • Added support for OpenRouter as a new model provider, including API routing, client integration, and model handling.
    • Introduced settings UI for OpenRouter API key and endpoint configuration.
    • Enabled OpenRouter model usage with chat functionality, streaming support, and model listing.
    • Added environment variables for OpenRouter API key and URL configuration.
  • Documentation

    • Updated README files with guidance on configuring OpenRouter models and environment variables.
  • Localization

    • Added OpenRouter-related translations for English, Chinese, and Danish.
  • Bug Fixes

    • Enhanced icon assignment to recognize additional model name prefixes.


vercel bot commented May 11, 2025

@hyc1230 is attempting to deploy a commit to the NextChat Team on Vercel.

A member of the Team first needs to authorize it.


coderabbitai bot commented May 11, 2025

Walkthrough

This change introduces comprehensive support for the OpenRouter provider throughout the application. It adds OpenRouter to environment configuration, client and server APIs, UI settings, localization files, and documentation. New logic is implemented for API routing, authentication, model listing, and chat interactions specific to OpenRouter, with corresponding updates to environment variables and user interface components.

Changes

  • .env.template, README.md, README_CN.md: Added OPENROUTER_API_KEY and OPENROUTER_URL environment variables; updated documentation to explain OpenRouter configuration and usage, including model registration examples.
  • app/constant.ts: Added OpenRouter to provider enums and constants; defined base URL, API paths, and default model; extended DEFAULT_MODELS with OpenRouter metadata.
  • app/config/server.ts: Added OPENROUTER_API_KEY and OPENROUTER_URL to environment config; included OpenRouter-specific fields in the server-side config object.
  • app/store/access.ts: Added OpenRouter API key and URL to access state; implemented validation and authorization logic for OpenRouter credentials.
  • app/client/api.ts: Integrated OpenRouter into client API logic, including header management, instantiation, and provider selection.
  • app/client/platforms/openrouter.ts: Introduced a new OpenRouterApi class implementing OpenRouter-specific chat, model listing, and request logic, including streaming and vision support.
  • app/api/openrouter.ts: Added a new OpenRouter API handler for proxying requests, handling authentication, CORS, model restriction, and response streaming.
  • app/api/auth.ts: Updated authentication logic to support system API key usage for the OpenRouter provider.
  • app/api/[provider]/[...path]/route.ts: Routed the OpenRouter API path to the new OpenRouter handler.
  • app/components/settings.tsx: Added OpenRouter configuration UI for endpoint and API key in the settings panel.
  • app/components/emoji.tsx: Extended model name prefix checks to include OpenRouter and other namespaced model identifiers for icon assignment.
  • app/locales/en.ts, app/locales/cn.ts, app/locales/da.ts: Added localized UI strings for OpenRouter API key and endpoint configuration in English, Chinese, and Danish.

Sequence Diagram(s)

sequenceDiagram
    participant User
    participant UI_Settings
    participant AccessStore
    participant ClientApi
    participant OpenRouterApi
    participant ServerAPI
    participant OpenRouterBackend

    User->>UI_Settings: Selects OpenRouter provider, enters API key and endpoint
    UI_Settings->>AccessStore: Stores OpenRouter credentials
    User->>ClientApi: Initiates chat request (OpenRouter)
    ClientApi->>OpenRouterApi: Calls chat() with message
    OpenRouterApi->>ServerAPI: Sends proxied request with credentials
    ServerAPI->>OpenRouterBackend: Forwards request, handles auth and model restrictions
    OpenRouterBackend-->>ServerAPI: Returns model response (streaming/non-streaming)
    ServerAPI-->>OpenRouterApi: Returns processed response
    OpenRouterApi-->>ClientApi: Delivers chat result
    ClientApi-->>User: Displays model response


Suggested labels

planned, enhancement

Suggested reviewers

  • Dogtiti

Poem

In fields of code, a rabbit hopped,
With OpenRouter seeds, it stopped.
It planted keys and endpoints new,
Sprouting models in the UI view.
Now chats can stream and models sing,
As OpenRouter joins the spring!
🐇✨

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 2

🧹 Nitpick comments (6)
app/api/openrouter.ts (2)

44-47: Stale comment & path handling clarity

The comment still references “alibaba”, which can confuse future maintainers. Suggest updating to reflect OpenRouter or removing entirely.


69-79: Forward original headers for better compatibility

Only Content-Type and Authorization are relayed. Many OpenRouter models expect X-Title, HTTP-Referer, or other custom headers. Forwarding all client-supplied headers (except ones you intentionally strip) improves compatibility:

-    headers: {
-      "Content-Type": "application/json",
-      Authorization: req.headers.get("Authorization") ?? "",
-    },
+    headers: Object.fromEntries(req.headers),
app/constant.ts (1)

852-856: Provider sort index inconsistent with client implementation

DEFAULT_MODELS adds OpenRouter with sorted: 15, while app/client/platforms/openrouter.ts returns models with sorted: 14. This discrepancy can break deterministic ordering in UI lists.

Pick one value (preferably 15 to stay monotonic after SiliconFlow’s 14) and use it consistently.

-        sorted: 14,
+        sorted: 15,
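To illustrate why the two values must agree, a minimal sketch of ordering by the `sorted` field (the field name comes from this review; the model entries are invented for illustration):

```typescript
// Model lists from constants and from the client are merged and ordered by
// their `sorted` index; inconsistent indices across the two sources break
// deterministic ordering in UI lists.
interface ModelEntry {
  provider: string;
  sorted: number;
}

const merged: ModelEntry[] = [
  { provider: "OpenRouter", sorted: 15 },
  { provider: "SiliconFlow", sorted: 14 },
];
merged.sort((a, b) => a.sorted - b.sorted);
console.log(merged.map((m) => m.provider)); // ["SiliconFlow", "OpenRouter"]
```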
app/client/platforms/openrouter.ts (3)

1-3: Out-of-date header comment

The opening comment still mentions “azure and openai”. Consider updating to avoid confusion.

-// azure and openai, using same models. so using same LLMApi.
+// Client implementation for OpenRouter LLM provider.

71-73: Log messages reference the wrong provider

Both logs label the payload as “openai”, which can mislead when debugging.

- console.log("[Request] openai payload: ", requestPayload);
+ console.log("[Request] openrouter payload: ", requestPayload);

Also applies to: 120-121


126-134: Required OpenRouter headers not sent

OpenRouter’s docs require a Referer (or HTTP-Referer) header identifying the calling site and often an X-Title. Consider enriching getHeaders() for the OpenRouter branch or injecting here:

headers: {
  ...getHeaders(),
  Referer: location.origin,
  "X-Title": "NextChat",
},
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 3809375 and 2ec8372.

📒 Files selected for processing (16)
  • .env.template (1 hunks)
  • README.md (2 hunks)
  • README_CN.md (2 hunks)
  • app/api/[provider]/[...path]/route.ts (2 hunks)
  • app/api/auth.ts (1 hunks)
  • app/api/openrouter.ts (1 hunks)
  • app/client/api.ts (7 hunks)
  • app/client/platforms/openrouter.ts (1 hunks)
  • app/components/emoji.tsx (2 hunks)
  • app/components/settings.tsx (3 hunks)
  • app/config/server.ts (3 hunks)
  • app/constant.ts (7 hunks)
  • app/locales/cn.ts (1 hunks)
  • app/locales/da.ts (1 hunks)
  • app/locales/en.ts (1 hunks)
  • app/store/access.ts (5 hunks)
🧰 Additional context used
🧬 Code Graph Analysis (4)
app/store/access.ts (1)
app/constant.ts (1)
  • OPENROUTER_BASE_URL (39-39)
app/client/api.ts (1)
app/client/platforms/openrouter.ts (1)
  • OpenRouterApi (43-287)
app/components/settings.tsx (2)
app/components/ui-lib.tsx (2)
  • ListItem (54-88)
  • PasswordInput (272-295)
app/constant.ts (1)
  • OpenRouter (274-278)
app/api/openrouter.ts (4)
app/config/server.ts (1)
  • getServerSideConfig (132-278)
app/api/auth.ts (1)
  • auth (27-132)
app/constant.ts (1)
  • OPENROUTER_BASE_URL (39-39)
app/utils/model.ts (1)
  • isModelNotavailableInServer (230-258)
🪛 LanguageTool
README.md

[uncategorized] ~315-~315: You might be missing the article “the” here.
Context: ...ow qwen/qwen3-32b:free(OpenRouter) in model list. For Azure: use `modelName@Azure=...

(AI_EN_LECTOR_MISSING_DETERMINER_THE)

🔇 Additional comments (28)
.env.template (1)

84-89: Correctly implemented OpenRouter environment variables

The new environment variables for OpenRouter integration are properly documented and follow the existing pattern in the file. Both OPENROUTER_API_KEY and OPENROUTER_URL are marked as optional with clear comments.

README.md (2)

314-316: Well-documented OpenRouter model configuration

The documentation clearly explains how to manually configure models from OpenRouter using the CUSTOM_MODELS environment variable. The provided example with +qwen/qwen3-32b:free@OpenRouter is helpful for users to understand the syntax.

🧰 Tools
🪛 LanguageTool

[uncategorized] ~315-~315: You might be missing the article “the” here.
Context: ...ow qwen/qwen3-32b:free(OpenRouter) in model list. For Azure: use `modelName@Azure=...

(AI_EN_LECTOR_MISSING_DETERMINER_THE)


367-374: Properly documented environment variables

The new environment variables for OpenRouter are well-documented following the same format as other providers. This ensures users have the necessary information to configure the OpenRouter integration.

app/api/auth.ts (1)

107-109: Correctly implemented OpenRouter authentication

The implementation for OpenRouter authentication follows the established pattern in the file. The switch case properly assigns the OpenRouter API key from server configuration when no user API key is provided.
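The pattern described here can be sketched as follows (identifiers are assumptions for illustration, not the exact names used in `app/api/auth.ts`):

```typescript
// Fall back to the server-side key only when the user supplied none.
interface ServerConfig {
  openrouterApiKey?: string;
  apiKey?: string;
}

function resolveApiKey(
  provider: string,
  userKey: string | undefined,
  cfg: ServerConfig,
): string | undefined {
  if (userKey) return userKey; // a user-provided key always wins
  switch (provider) {
    case "OpenRouter":
      return cfg.openrouterApiKey; // the case this PR adds
    default:
      return cfg.apiKey;
  }
}

console.log(resolveApiKey("OpenRouter", undefined, { openrouterApiKey: "server-key" })); // "server-key"
```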

app/api/[provider]/[...path]/route.ts (2)

18-18: Properly imported OpenRouter handler

The OpenRouter handler import follows the same pattern as other provider handlers.


54-55: Correctly implemented OpenRouter routing

The OpenRouter API path case is properly integrated into the routing switch statement, following the established pattern for other providers.

app/locales/cn.ts (1)

510-520: Clean implementation of OpenRouter localization strings

The addition of OpenRouter configuration strings follows the same structure used for other providers, with clear and consistent translations for API key and endpoint settings.

app/locales/en.ts (1)

494-504: Good addition of English translations for OpenRouter

The implementation correctly adds English localization strings for OpenRouter configuration options, maintaining consistency with other provider settings.

app/components/emoji.tsx (4)

60-61: Support for OpenAI model prefixes expanded

The addition of "o3" model prefix recognition and "openai/" namespace pattern improves icon handling for various OpenAI models, including those that might be accessed through OpenRouter.


64-77: Enhanced model namespace detection for Google and Anthropic models

Restructured conditions to recognize both direct model names and namespace-prefixed patterns (like "google/gemini" and "anthropic/claude"), which improves icon assignment for models accessed through aggregators like OpenRouter.


81-85: Improved Mistral AI model detection

Added support for the "mistralai/" namespace prefix, ensuring proper icon assignment for Mistral models that may be accessed through OpenRouter.


95-98: Enhanced Grok model detection

Extended the condition to recognize "x-ai/grok" namespace pattern, maintaining consistent icon handling across different provider naming conventions.

README_CN.md (2)

235-237: Clear documentation for OpenRouter model configuration

The instructions clearly explain that OpenRouter models need to be manually configured using the suggested syntax and provide a helpful example that demonstrates how to add an OpenRouter model to the system.


281-288: Well-documented environment variables for OpenRouter

The addition of OPENROUTER_API_KEY and OPENROUTER_URL environment variables follows the same documentation pattern as other providers, making it clear for users how to configure OpenRouter integration.

app/components/settings.tsx (3)

78-78: Clean import addition for OpenRouter.

The import of OpenRouter from constants is properly added alongside other service provider imports.


1363-1402: Well-structured OpenRouter configuration component.

The implementation follows the same pattern as other provider configurations, with proper endpoint URL and API key inputs. The component correctly binds to the access store state and includes appropriate localization for labels and placeholders.


1866-1866: Proper integration of the OpenRouter component.

The component is correctly included in the conditional rendering logic alongside other provider-specific components.

app/locales/da.ts (1)

470-480: Complete and consistent Danish translations for OpenRouter.

The localization entries for OpenRouter follow the same structure and naming conventions as other provider translations, providing all necessary text for the UI components.

app/store/access.ts (5)

20-20: Proper import for OpenRouter base URL.

The OpenRouter base URL constant import is correctly added to the list of imports.


63-63: Correct implementation of default OpenRouter URL.

The default URL is properly set based on whether the application is running in app mode or web mode, consistent with other provider implementations.


138-140: Well-structured state properties for OpenRouter.

The OpenRouter URL and API key properties are properly added to the default state, following the same pattern as other service providers.


229-231: Clean validation method for OpenRouter.

The isValidOpenRouter() method correctly validates the presence of the OpenRouter API key, using the same pattern as other provider validation methods.


252-252: Proper integration with authorization logic.

The OpenRouter validation is correctly added to the isAuthorized() method, ensuring OpenRouter credentials are properly checked during authorization.
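A sketch of the validation/authorization pattern these comments describe (the shapes are assumed; the real code lives in a zustand store):

```typescript
// Authorization passes if any provider's credentials validate or an access
// code is set; this PR adds the OpenRouter branch to that check.
interface AccessState {
  openrouterApiKey: string;
  accessCode: string;
}

function isValidOpenRouter(s: AccessState): boolean {
  return s.openrouterApiKey.length > 0;
}

function isAuthorized(s: AccessState): boolean {
  // Other provider checks elided for brevity.
  return isValidOpenRouter(s) || s.accessCode.length > 0;
}

console.log(isAuthorized({ openrouterApiKey: "sk-or-placeholder", accessCode: "" })); // true
```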

app/config/server.ts (3)

91-94: Well-defined environment variables for OpenRouter.

The OpenRouter URL and API key environment variables are properly declared in the ProcessEnv interface, following the same pattern as other providers.


170-170: Correct flag for detecting OpenRouter configuration.

The isOpenRouter flag properly detects if OpenRouter is configured based on the presence of an API key, consistent with the approach used for other providers.


254-257: Complete server-side configuration for OpenRouter.

The OpenRouter configuration is properly added to the server-side config object, with correct handling of the API key through the getApiKey helper function for key rotation/selection.

app/client/api.ts (2)

177-180: OpenRouter provider wiring looks good

The new case ModelProvider.OpenRouter correctly instantiates OpenRouterApi, and the preceding break prevents fall-through.


272-274: isOpenRouter flag is calculated but never used

isOpenRouter is returned from getConfig() and destructured later, yet nothing in the remainder of getHeaders (or the surrounding file) references it. Leaving unused variables tends to trigger linter warnings and adds cognitive load.

If no follow-up code will consume this flag, consider deleting it:

-    const isOpenRouter =
-      modelConfig.providerName === ServiceProvider.OpenRouter;
...
-      isOpenRouter,


Also applies to: 315-316, 345-345

Comment on lines +69 to +80
headers: {
"Content-Type": "application/json",
Authorization: req.headers.get("Authorization") ?? "",
},
method: req.method,
body: req.body,
redirect: "manual",
// @ts-ignore
duplex: "half",
signal: controller.signal,
};

⚠️ Potential issue

Body stream consumed twice – may forward an empty body

  1. fetchOptions.body is first initialised with req.body (a ReadableStream).
  2. Later await req.text() is invoked to inspect/clone the payload, which consumes that stream.
  3. If serverConfig.customModels is falsy, the clonedBody replacement never occurs and the already-consumed stream is forwarded, resulting in an empty request body at OpenRouter.
-  const fetchOptions: RequestInit = {
-    ...
-    body: req.body,
+  const rawBody = await req.text();      // consume once
+  const fetchOptions: RequestInit = {
+    ...
+    body: rawBody.length ? rawBody : undefined,
   ...
-  if (serverConfig.customModels && req.body) {
+  if (serverConfig.customModels && rawBody.length) {
     ...
-      const clonedBody = await req.text();
-      fetchOptions.body = clonedBody;
+      const jsonBody = JSON.parse(rawBody) as { model?: string };

This guarantees the body is read a single time and forwarded intact.

Also applies to: 82-110
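The single-read constraint behind this issue is easy to demonstrate (a sketch assuming Node 18+, where `Request` is available globally):

```typescript
// A Request body is a one-shot stream: after text(), bodyUsed flips to true
// and any further read rejects. Reading the body once up front avoids
// forwarding an already-drained stream.
async function demo(): Promise<void> {
  const req = new Request("https://example.com/api", {
    method: "POST",
    body: '{"model":"openrouter/auto"}',
  });
  const raw = await req.text(); // consume exactly once
  console.log(raw.length > 0);  // true
  console.log(req.bodyUsed);    // true — a second req.text() would throw
}
demo();
```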

@coderabbitai coderabbitai bot left a comment
Actionable comments posted: 0

🧹 Nitpick comments (5)
app/client/platforms/openrouter.ts (5)

182-183: Consider improving type safety instead of using @ts-ignore

There are TypeScript type safety issues being bypassed with @ts-ignore. Consider improving the type definitions to properly handle the tool call arguments array indexing without needing to ignore TypeScript errors.

-                // @ts-ignore
-                runTools[index]["function"]["arguments"] += args;
+                if (runTools[index] && runTools[index].function) {
+                  runTools[index].function.arguments += args;
+                }

224-226: Improve type safety in tool message handling

Multiple @ts-ignore comments indicate potential type safety issues when manipulating the request payload messages. Consider adding appropriate type guards or interfaces to handle this more safely.

-            // @ts-ignore
-            requestPayload?.messages?.splice(
-              // @ts-ignore
-              requestPayload?.messages?.length,
+            if (requestPayload.messages) {
+              requestPayload.messages.splice(
+                requestPayload.messages.length,

274-274: Translate Chinese comment to English

For consistency with the rest of the codebase, which is in English, consider translating the Chinese comment to English.

-    let seq = 1000; //同 Constant.ts 中的排序保持一致
+    let seq = 1000; // Keep consistent with the sorting in Constant.ts

116-117: Improve comment clarity about max_tokens exclusion

The comment about excluding max_tokens doesn't clearly explain the reasoning. Consider providing a more informative explanation or linking to documentation that explains this decision.

-      // max_tokens: Math.max(modelConfig.max_tokens, 1024),
-      // Please do not ask me why not send max_tokens, no reason, this param is just shit, I dont want to explain anymore.
+      // Deliberately excluding max_tokens parameter as it can cause unexpected truncation issues with OpenRouter
+      // See: [add reference to relevant documentation or discussion]

120-120: Update log message to reference OpenRouter

The log message references OpenAI but this is the OpenRouter implementation.

-      console.log("[Request] openai payload: ", requestPayload);
+      console.log("[Request] openrouter payload: ", requestPayload);
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 2ec8372 and 4bacfe5.

📒 Files selected for processing (1)
  • app/client/platforms/openrouter.ts (1 hunks)
🔇 Additional comments (2)
app/client/platforms/openrouter.ts (2)

43-287: LGTM! Well-structured OpenRouter API implementation

The OpenRouter API implementation follows the established patterns for LLM providers in the codebase. It properly handles custom URLs, authentication, streaming, non-streaming requests, and model listing. The implementation of the thinking/reasoning feature and tool calls also looks correct.


83-246: ✅ Verification successful

Verify OpenRouter API compatibility with the implementation

The implementation assumes OpenRouter API has the same structure as OpenAI for handling streaming responses, tool calls, and reasoning. Please verify this is correct for all supported models through OpenRouter.


🌐 Web query:

Does OpenRouter API support the same streaming format and tool calls structure as OpenAI API?

💡 Result:

OpenRouter does support the same streaming format and tool calls structure as the OpenAI API, making it a highly compatible alternative for developers working with multiple LLM providers.

Streaming Implementation

OpenRouter implements streaming through Server-Sent Events (SSE), just like OpenAI's implementation. To enable streaming, you simply set the stream parameter to true in your request[6][9]. This allows the model to stream the response in chunks rather than returning the entire response at once, which is particularly useful for building chat interfaces where UI should update as the model generates the response[6].

The streaming functionality works with any model available through OpenRouter, making it a versatile feature[6]. Implementation examples include:

Python Example:

import requests
import json

question = "How would you build the tallest building ever?"
url = "https://openrouter.ai/api/v1/chat/completions"
headers = {
    "Authorization": f"Bearer {API_KEY}",
    "Content-Type": "application/json"
}
payload = {
    "model": "openai/chatgpt-4o-latest",
    "messages": [{"role": "user", "content": question}],
    "stream": True
}

with requests.post(url, headers=headers, json=payload, stream=True) as r:
    for chunk in r.iter_content(chunk_size=1024, decode_unicode=True):
        print(chunk, end="")  # process each streaming chunk as it arrives

Tool Calls & Structured Outputs

OpenRouter fully supports OpenAI's tool calling capabilities, allowing for structured outputs across various LLM providers[10]. When working with tool calls:

  1. You must ensure the model you're using supports Tool Calling or Structured Outputs in the OpenRouter models listing[10].
  2. You can use the same interface as you would with OpenAI's API.

For example, using the Instructor library with OpenRouter demonstrates this compatibility:

import os

from openai import OpenAI
import instructor
from pydantic import BaseModel

class User(BaseModel):
    name: str
    age: int

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.getenv("OPENROUTER_API_KEY"),
)
client = instructor.from_openai(client, mode=instructor.Mode.TOOLS)

resp = client.chat.completions.create(
    model="google/gemini-2.0-flash-lite-001",
    messages=[{"role": "user", "content": "Ivan is 28 years old"}],
    response_model=User,
    extra_body={"provider": {"require_parameters": True}},
)

API Compatibility

OpenRouter deliberately implements a standardized API compatible with OpenAI's Chat Completions endpoint, normalizing the request and response schema across different model architectures[2]. This provides developers with:

  1. Consistent access to diverse LLMs while maintaining control over model selection
  2. The ability to use OpenRouter as a drop-in replacement for OpenAI[9]
  3. Support for any SDK that works with OpenAI's API[9]

The base URL for all API interactions is 'https://openrouter.ai/api/v1', and the request schema supports the same parameters as OpenAI's API, including temperature control, max tokens, and stop sequences[7].

OpenRouter's compatibility extends to handling advanced features like JSON mode and the ability to work with both text and image inputs, with PDF support coming soon[9][10].

OpenRouter streaming and tool calls compatibility verified

OpenRouter’s API is fully compatible with OpenAI’s SSE-based streaming and tool-calling formats across all supported models, so no changes are needed here.
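Given that compatibility, the client-side work reduces to standard OpenAI-style SSE parsing; a minimal sketch of extracting streamed delta text (the field names are the standard OpenAI chunk fields, not code taken from this PR):

```typescript
// Parse OpenAI-style SSE lines ("data: {json}", terminated by "data: [DONE]")
// and collect the incremental delta.content strings.
function extractDeltas(sse: string): string[] {
  return sse
    .split("\n")
    .filter((l) => l.startsWith("data: ") && !l.includes("[DONE]"))
    .map((l) => JSON.parse(l.slice(6)))
    .map((chunk) => chunk.choices?.[0]?.delta?.content ?? "");
}

const sample =
  'data: {"choices":[{"delta":{"content":"Hel"}}]}\n' +
  'data: {"choices":[{"delta":{"content":"lo"}}]}\n' +
  "data: [DONE]\n";
console.log(extractDeltas(sample).join("")); // "Hello"
```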
