Feature Request: BYOK (Bring Your Own Key/Model) support for GDevelop AI Agent #7932

@olixis

Description


BEFORE opening a new feature request, I’ve discussed this with other users and searched the forum/roadmap.
AFTER opening: OK to close if tracked on the public roadmap.

Problem:
I’m often blocked by the current fixed set of AI providers/models in the AI Agent. I need to control cost, privacy/compliance (e.g., data residency), and choose models best suited for my project (e.g., code-gen vs. narrative design). Without BYOK, I can’t use my existing contracts (Azure OpenAI, OpenRouter, Anthropic, Google, etc.) or local models (Ollama/LM Studio). This limits experimentation and sometimes makes the AI Agent unusable in restricted environments.

Solution suggested

Add BYOK (Bring Your Own Key/Model) support so users can plug in any compatible LLM endpoint and credentials.

Proposed capabilities:

  • Multiple Providers: Support OpenAI-compatible APIs (OpenAI/Azure OpenAI), Anthropic, Google Generative AI, OpenRouter, plus a generic HTTP/OpenAI-compatible adapter (Mistral, llama.cpp servers, Ollama, LM Studio).
  • Config UI:
    • Global Settings → “AI Providers” with: Provider, API base URL, API key/headers, default model, max tokens, temperature, safety options.
    • Per-project override and per-tool (e.g., “AI Code Assistant,” “Dialogue Writer”) model selection.
  • Local/Offline: Allow specifying a local base URL (e.g., http://localhost:11434/v1) for on-device models, with performance/memory warnings.
  • Quota & Cost Controls: Soft token limits per action/session, usage counters, and optional cost estimates for supported providers.
  • Safety & Privacy: Toggles for sending project data; redaction options; clear notes about provider ToS and what data leaves GDevelop.
  • Backwards Compatibility: If no custom provider is set, fall back to the current default.
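
A minimal sketch of what a provider config and the backwards-compatible fallback could look like, in TypeScript as offered below. All names here (`AiProviderConfig`, `resolveProvider`, the placeholder default endpoint) are assumptions for illustration, not existing GDevelop APIs:

```typescript
// Hypothetical provider configuration for BYOK support.
// None of these identifiers exist in GDevelop today.
interface AiProviderConfig {
  id: string;            // e.g. "openai", "azure-openai", "ollama"
  baseUrl: string;       // e.g. "http://localhost:11434/v1" for local models
  apiKey?: string;       // optional: local servers usually need no key
  defaultModel: string;
  maxTokens?: number;
  temperature?: number;
}

// Placeholder for the current built-in provider; the real endpoint
// and model name are internal to GDevelop.
const DEFAULT_PROVIDER: AiProviderConfig = {
  id: "gdevelop-default",
  baseUrl: "https://example.invalid/ai", // placeholder, not a real endpoint
  defaultModel: "default",
};

// Backwards compatibility: a per-project override wins over the global
// setting, and if neither is configured we fall back to the current default.
function resolveProvider(
  global?: AiProviderConfig,
  perProject?: AiProviderConfig
): AiProviderConfig {
  return perProject ?? global ?? DEFAULT_PROVIDER;
}
```

With this shape, per-tool model selection could layer on top by overriding only `defaultModel` while inheriting the rest of the resolved config.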

Additional context
I can help test the OpenAI-compatible and Ollama flows. If helpful, I can draft a small provider interface (TypeScript) and example adapters.
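To make the adapter idea concrete, here is a rough sketch of how a request to an OpenAI-compatible endpoint could be assembled. It assumes the `/v1/chat/completions` convention (which Ollama and LM Studio also expose); the function and type names are made up for this sketch:

```typescript
// Hypothetical request builder for any OpenAI-compatible server.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

function buildChatRequest(
  baseUrl: string,
  model: string,
  messages: ChatMessage[],
  apiKey?: string
): { url: string; headers: Record<string, string>; body: string } {
  const headers: Record<string, string> = {
    "Content-Type": "application/json",
  };
  // Local servers (e.g. Ollama) typically accept requests without a key,
  // so the Authorization header is only set when a key is configured.
  if (apiKey) headers["Authorization"] = `Bearer ${apiKey}`;
  return {
    // Strip a trailing slash so base URLs with or without one both work.
    url: `${baseUrl.replace(/\/$/, "")}/chat/completions`,
    headers,
    body: JSON.stringify({ model, messages }),
  };
}
```

The same builder would cover OpenAI, Azure OpenAI (with its different base URL and header scheme handled by a thin wrapper), OpenRouter, and local llama.cpp/Ollama/LM Studio servers, which is what makes a generic adapter worthwhile.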

Alternatives considered

  • Requesting more first-party providers one by one: Doesn’t scale and still blocks local/offline use.
  • External scripts to call models outside GDevelop: Loses integrated UX (prompts, project context, guardrails).
  • Only supporting a single marketplace (e.g., OpenRouter only): Better than nothing, but excludes enterprise Azure setups and fully local models.
