## Problem

When using the `@ai-sdk/openai-compatible` provider (e.g. a LiteLLM proxy) with GPT-5.x models, opencode auto-injects `reasoningSummary: "auto"` into the provider options. LiteLLM does not recognize this parameter and returns:

```
litellm.BadRequestError: OpenAIException - Unknown parameter: 'reasoningSummary'
```
## Root Cause

In `packages/opencode/src/provider/transform.ts`, the `options()` function checks `input.model.api.id.includes("gpt-5")` and unconditionally sets `reasoningSummary: "auto"`, regardless of the provider's npm package:

```ts
if (input.model.api.id.includes("gpt-5") && !input.model.api.id.includes("gpt-5-chat")) {
  if (!input.model.api.id.includes("gpt-5-pro")) {
    result["reasoningEffort"] = "medium";
    result["reasoningSummary"] = "auto"; // <-- injected for ALL providers
  }
}
```
The `variants()` function correctly scopes `reasoningSummary` to only `@ai-sdk/openai`, `@ai-sdk/azure`, and `@ai-sdk/github-copilot`, but `options()` does not.
## Why it can't be worked around via config

- Setting `"reasoningSummary": null` in the model's `options` does not help: `mergeDeep` (remeda) preserves `null`, and `@ai-sdk/openai-compatible` sends it in the request body.
- Setting `"reasoningSummary": undefined` does not help: `mergeDeep` skips `undefined`, so the base value `"auto"` remains.
- There is no config-level way to delete a key injected by the base `options()`.
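To illustrate why neither override works, here is a simplified stand-in for the merge semantics described above (this mimics the behavior attributed to remeda's `mergeDeep` in this issue; it is not remeda's actual implementation):

```typescript
// Simplified deep merge: null overrides the base value, undefined is skipped.
function mergeDeep(
  base: Record<string, unknown>,
  override: Record<string, unknown>,
): Record<string, unknown> {
  const out: Record<string, unknown> = { ...base }
  for (const [key, value] of Object.entries(override)) {
    if (value === undefined) continue // undefined is skipped -> base "auto" survives
    const prev = out[key]
    if (
      prev !== null && typeof prev === "object" && !Array.isArray(prev) &&
      value !== null && typeof value === "object" && !Array.isArray(value)
    ) {
      out[key] = mergeDeep(prev as Record<string, unknown>, value as Record<string, unknown>)
    } else {
      out[key] = value // null is preserved -> still sent in the request body
    }
  }
  return out
}

const base = { reasoningEffort: "medium", reasoningSummary: "auto" }
console.log(mergeDeep(base, { reasoningSummary: null }))
// { reasoningEffort: 'medium', reasoningSummary: null }
console.log(mergeDeep(base, { reasoningSummary: undefined }))
// { reasoningEffort: 'medium', reasoningSummary: 'auto' }
```

Either way, `reasoningSummary` ends up in the request body that LiteLLM rejects.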
## Expected Behavior

`reasoningSummary` should only be injected for providers that support it (`@ai-sdk/openai`, `@ai-sdk/azure`, `@ai-sdk/github-copilot`), consistent with how `variants()` already handles this.
## Suggested Fix

Wrap the `reasoningSummary` assignment in `options()` with a provider check:

```ts
if (["@ai-sdk/openai", "@ai-sdk/azure", "@ai-sdk/github-copilot"].includes(input.model.api.npm)) {
  result["reasoningSummary"] = "auto";
}
```
## Config

```json
{
  "provider": {
    "litellm": {
      "npm": "@ai-sdk/openai-compatible",
      "models": {
        "gpt-5.4": {
          "name": "gpt-5.4",
          "limit": { "context": 1050000, "output": 128000 }
        }
      }
    }
  }
}
```
## Environment
- opencode v1.4.3 (Homebrew, compiled Bun binary)
- LiteLLM proxy as OpenAI-compatible backend