[FEATURE]: Auto-load available models from LiteLLM proxy with autoload option #13891

@farukcankaya

Description

  • I have verified this feature I'm about to request hasn't been suggested before.

Describe the enhancement you want to request

Problem

When using OpenCode with a LiteLLM proxy, users must manually define every model in opencode.json. A LiteLLM proxy already exposes all of its available models via its OpenAI-compatible /models endpoint. If the proxy serves 20+ models, the config becomes tedious to maintain and goes stale as models are added or removed on the proxy side.

Proposed solution

Add an autoload: true option that works alongside litellmProxy: true. When both are set, OpenCode fetches the list of available models from the proxy's /models endpoint at startup. When only litellmProxy: true is set (without autoload), models must be defined manually as before.

Manual models only (litellmProxy: true):

{
  "provider": {
    "MyLiteLLMProxy": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "My LiteLLM Proxy",
      "options": {
        "baseURL": "https://litellm.example.com/v1",
        "litellmProxy": true
      },
      "models": {
        "gpt-4": { "name": "GPT-4" },
        "anthropic/claude-opus-4-6": { "name": "anthropic/claude-opus-4-6" },
        "deepseek-chat": { "name": "DeepSeek Chat" }
      }
    }
  }
}

Auto-load all models (litellmProxy: true + autoload: true):

{
  "provider": {
    "my-proxy": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "My LiteLLM Proxy",
      "options": {
        "baseURL": "https://litellm.example.com/v1",
        "litellmProxy": true,
        "autoload": true,
        "apiKey": "sk-key"
      }
    }
  }
}

Key behaviors:

  • litellmProxy: true marks the provider as a LiteLLM proxy
  • autoload: true opts into fetching models from the /models endpoint at startup
  • Auto-loading requires both flags to be set (or a provider ID containing "litellm" together with autoload: true)
  • Manually configured models are never overridden (user config takes precedence)
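
The behaviors above could be sketched roughly as follows. This is not OpenCode's actual implementation; the function names and the merge strategy are illustrative, and the response shape `{ data: [{ id: string }] }` is assumed from the OpenAI-compatible /models contract that LiteLLM follows:

```typescript
type ModelConfig = { name: string };
type ModelMap = Record<string, ModelConfig>;

// Merge auto-discovered model IDs into the user's manual model map.
// Manually configured models always win (user config takes precedence).
function mergeModels(manual: ModelMap, discoveredIds: string[]): ModelMap {
  const merged: ModelMap = {};
  for (const id of discoveredIds) {
    merged[id] = { name: id }; // default display name is the model ID
  }
  // Spread manual entries last so they override discovered ones.
  return { ...merged, ...manual };
}

// Fetch model IDs from the proxy at startup, only when both
// litellmProxy: true and autoload: true are set in the provider options.
async function fetchModelIds(baseURL: string, apiKey: string): Promise<string[]> {
  const res = await fetch(`${baseURL}/models`, {
    headers: { Authorization: `Bearer ${apiKey}` },
  });
  if (!res.ok) throw new Error(`GET /models failed: ${res.status}`);
  const body = (await res.json()) as { data: { id: string }[] };
  return body.data.map((m) => m.id);
}
```

Merging after fetch (rather than replacing) is what keeps the precedence guarantee: a model defined both manually and on the proxy keeps its manual configuration.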

Why this belongs in OpenCode

  • "Support for new providers" is listed as an accepted contribution type in CONTRIBUTING.md
  • LiteLLM is the most common proxy for teams running multiple LLM providers behind a single gateway
  • This removes friction for self-hosted and enterprise setups where model lists change frequently
  • Zero impact on existing users - autoload is opt-in alongside litellmProxy
