
[Bug] Vertex AI: thinkingConfig.includeThoughts sent unconditionally, breaks non-thinking Gemini models (2.0-flash, 2.5-flash-lite, etc.) #18243

@uncle-rennus

Description


Describe the bug

When using @ai-sdk/google-vertex as the provider, OpenCode unconditionally adds thinkingConfig: { includeThoughts: true } to every request, regardless of whether the selected model supports thinking. This causes a Vertex AI 400 error for models that don't support it:

Unable to submit request because Thinking_config.include_thoughts is only enabled when thinking is enabled.

Root cause (exact file + lines)

In packages/opencode/src/provider/transform.ts, inside the options() export function:

if (input.model.api.npm === "@ai-sdk/google" || input.model.api.npm === "@ai-sdk/google-vertex") {
  result["thinkingConfig"] = {
    includeThoughts: true,
  }
  if (input.model.api.id.includes("gemini-3")) {
    result["thinkingConfig"]["thinkingLevel"] = "high"
  }
}

This block runs for every Gemini model on Google/Vertex, with no check for whether the model's capabilities.reasoning is true. So gemini-2.0-flash, gemini-2.5-flash-lite, and any other non-thinking model all get includeThoughts: true injected — which Vertex rejects.

Proposed fix

Gate it on capabilities.reasoning:

if (input.model.api.npm === "@ai-sdk/google" || input.model.api.npm === "@ai-sdk/google-vertex") {
  if (input.model.capabilities.reasoning) {
    result["thinkingConfig"] = {
      includeThoughts: true,
    }
    if (input.model.api.id.includes("gemini-3")) {
      result["thinkingConfig"]["thinkingLevel"] = "high"
    }
  }
}
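The gated logic can be exercised in isolation. The sketch below is hypothetical: the ModelInfo shape and the thinkingOptions name are simplified stand-ins for the actual transform.ts types, used only to show that non-reasoning models end up with no thinkingConfig at all.

```typescript
// Hypothetical, simplified model shape; the real type lives in transform.ts.
interface ModelInfo {
  api: { npm: string; id: string }
  capabilities: { reasoning: boolean }
}

// Standalone sketch of the proposed gated block from options().
function thinkingOptions(model: ModelInfo): Record<string, unknown> {
  const result: Record<string, any> = {}
  if (model.api.npm === "@ai-sdk/google" || model.api.npm === "@ai-sdk/google-vertex") {
    // Only inject thinkingConfig when the model actually supports thinking.
    if (model.capabilities.reasoning) {
      result["thinkingConfig"] = { includeThoughts: true }
      if (model.api.id.includes("gemini-3")) {
        result["thinkingConfig"]["thinkingLevel"] = "high"
      }
    }
  }
  return result
}
```

With this gate, gemini-2.0-flash (reasoning: false) produces an empty options object and Vertex no longer sees includeThoughts, while gemini-2.5-pro (reasoning: true) keeps its thinking output unchanged.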

Affected models on Vertex

  • gemini-2.0-flash / gemini-2.0-flash-001
  • gemini-2.5-flash-lite-preview-0514
  • Any Gemini model without capabilities.reasoning = true

Working models (not affected)

  • gemini-2.5-pro (has capabilities.reasoning, thinking works fine)

Note on smallOptions()

The smallOptions() function handles the google providerID correctly (it sends thinkingBudget: 0 to suppress thinking), but that suppression never applies on Vertex, because smallOptions() only checks model.providerID === "google" and ignores the @ai-sdk/google-vertex package.
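One possible shape for that secondary fix, sketched with a hypothetical SmallModel type and smallOptions signature (the real function in transform.ts differs): check the npm package as well as the providerID, mirroring the package check that options() already uses.

```typescript
// Hypothetical, simplified model shape for illustration only.
interface SmallModel {
  providerID: string
  api: { npm: string }
}

// Sketch: suppress thinking for small-model calls on both Google providers,
// not just when providerID === "google".
function smallOptions(model: SmallModel): Record<string, unknown> {
  const isGoogle =
    model.providerID === "google" ||
    model.api.npm === "@ai-sdk/google" ||
    model.api.npm === "@ai-sdk/google-vertex"
  if (isGoogle) {
    // thinkingBudget: 0 tells Gemini not to spend any tokens on thinking.
    return { thinkingConfig: { thinkingBudget: 0 } }
  }
  return {}
}
```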

Plugins

N/A

OpenCode version

1.2.10

Steps to reproduce

  1. Configure OpenCode with Google Vertex AI provider + service account (GOOGLE_APPLICATION_CREDENTIALS)
  2. Select any non-thinking model: gemini-2.0-flash, gemini-2.5-flash-lite-preview-0514, etc.
  3. Send any message
  4. → Vertex returns 400: Thinking_config.include_thoughts is only enabled when thinking is enabled

Screenshot and/or share link

No response

Operating System

Windows 11

Terminal

Windows Terminal

Labels

bug (Something isn't working), core (Anything pertaining to core functionality of the application (opencode server stuff))
