reasoningSummary injected for gpt-5 models on @ai-sdk/openai-compatible providers causes litellm errors #22350

@nazarhnatyshen

Description

Problem

When using the @ai-sdk/openai-compatible provider (e.g. behind a LiteLLM proxy) with GPT-5.x models, opencode auto-injects reasoningSummary: "auto" into the provider options. LiteLLM does not recognize this parameter and returns:

litellm.BadRequestError: OpenAIException - Unknown parameter: 'reasoningSummary'

Root Cause

In packages/opencode/src/provider/transform.ts, the options() function checks input.model.api.id.includes("gpt-5") and unconditionally sets reasoningSummary: "auto" regardless of the npm provider package:

if (input.model.api.id.includes("gpt-5") && !input.model.api.id.includes("gpt-5-chat")) {
  if (!input.model.api.id.includes("gpt-5-pro")) {
    result["reasoningEffort"] = "medium";
    result["reasoningSummary"] = "auto";  // <-- injected for ALL providers
  }
}

The variants() function correctly scopes reasoningSummary to only @ai-sdk/openai, @ai-sdk/azure, and @ai-sdk/github-copilot — but options() does not.

Why it can't be worked around via config

  • Setting "reasoningSummary": null in model options does not help — mergeDeep (remeda) preserves null, and @ai-sdk/openai-compatible sends it in the request body.
  • Setting "reasoningSummary": undefined does not help — mergeDeep skips undefined, so the base value "auto" remains.
  • There is no config-level way to delete a key injected by the base options().
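To illustrate why neither config value removes the key, here is a minimal sketch of the merge semantics described above (a hypothetical stand-in for remeda's mergeDeep, written from the behavior reported in this issue, not copied from the library):

```typescript
// Illustrative deep merge with the semantics described above:
// explicit null overrides are kept, undefined overrides are skipped.
type Dict = Record<string, unknown>;

function mergeDeep(base: Dict, override: Dict): Dict {
  const result: Dict = { ...base };
  for (const [key, value] of Object.entries(override)) {
    if (value === undefined) continue; // undefined is skipped, base value wins
    const existing = result[key];
    if (
      value !== null && typeof value === "object" && !Array.isArray(value) &&
      existing !== null && typeof existing === "object" && !Array.isArray(existing)
    ) {
      result[key] = mergeDeep(existing as Dict, value as Dict);
    } else {
      result[key] = value; // null is preserved and ends up in the request body
    }
  }
  return result;
}

const base = { reasoningEffort: "medium", reasoningSummary: "auto" };

// null survives the merge, so litellm still receives the unknown key:
console.log(mergeDeep(base, { reasoningSummary: null }));
// undefined is skipped, so the injected "auto" remains:
console.log(mergeDeep(base, { reasoningSummary: undefined }));
```

Either way, a reasoningSummary key reaches the openai-compatible request, which is why only a fix in options() itself can help.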

Expected Behavior

reasoningSummary should only be injected for providers that support it (@ai-sdk/openai, @ai-sdk/azure, @ai-sdk/github-copilot), consistent with how variants() already handles this.

Suggested Fix

Wrap the reasoningSummary assignment in options() with a provider check:

if (["@ai-sdk/openai", "@ai-sdk/azure", "@ai-sdk/github-copilot"].includes(input.model.api.npm)) {
  result["reasoningSummary"] = "auto";
}
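Putting the provider check together with the existing gpt-5 branch, the fixed logic could look like the sketch below (the helper name and the input/result shapes are assumptions for illustration; the real code lives inline in options() in transform.ts):

```typescript
// Providers that variants() already treats as supporting reasoningSummary.
const REASONING_SUMMARY_PROVIDERS = [
  "@ai-sdk/openai",
  "@ai-sdk/azure",
  "@ai-sdk/github-copilot",
];

// Hypothetical extraction of the gpt-5 branch of options() for clarity.
function applyGpt5Defaults(
  input: { model: { api: { id: string; npm: string } } },
  result: Record<string, unknown>,
): void {
  const id = input.model.api.id;
  if (id.includes("gpt-5") && !id.includes("gpt-5-chat")) {
    if (!id.includes("gpt-5-pro")) {
      result["reasoningEffort"] = "medium";
      // Inject reasoningSummary only for providers known to accept it,
      // mirroring the scoping variants() already applies.
      if (REASONING_SUMMARY_PROVIDERS.includes(input.model.api.npm)) {
        result["reasoningSummary"] = "auto";
      }
    }
  }
}
```

With this change, @ai-sdk/openai-compatible models still get reasoningEffort, but the unrecognized reasoningSummary key never reaches LiteLLM.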

Config

{
  "provider": {
    "litellm": {
      "npm": "@ai-sdk/openai-compatible",
      "models": {
        "gpt-5.4": {
          "name": "gpt-5.4",
          "limit": { "context": 1050000, "output": 128000 }
        }
      }
    }
  }
}

Environment

  • opencode v1.4.3 (Homebrew, compiled Bun binary)
  • LiteLLM proxy as OpenAI-compatible backend

Labels

core — Anything pertaining to core functionality of the application (opencode server stuff)
