[BUG] Session title generation fails silently since v1.3.3 — effort parameter leaks into small model call #20269

@CTHua


Bug Description

Session title generation has been silently failing since v1.3.3. All new sessions retain their default "New session - <timestamp>" title instead of getting an LLM-generated title.

The root cause: when the user's selected model has a variant that includes an effort parameter (e.g. anthropic/claude-opus-4-6 with variant max), that parameter leaks into the LLM.stream call for the title agent. The title agent resolves to a small model (claude-haiku-4-5-20251001) which does not support the effort parameter, causing a 400 error from the Anthropic API. The error is swallowed by Effect.ignore on the fork, so the failure is completely silent.
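The silent part of the failure is just error suppression around a background task. Here is a synchronous plain-TypeScript analogue (not the actual opencode code, which forks the title task and wraps it in Effect.ignore) of why nothing ever surfaces to the user:

```typescript
// Analogue of the 400 the Anthropic API returns for the small model.
function generateTitle(): string {
  throw new Error("This model does not support the effort parameter.")
}

// The fallback is what the user ends up seeing: the default session title.
function ensureTitle(fallback: string): string {
  try {
    return generateTitle()
  } catch {
    // Error swallowed, matching the Effect.ignore behavior described above.
    return fallback
  }
}

console.log(ensureTitle("New session - <timestamp>"))
// The failure is invisible; the default title is kept.
```

Because the catch discards the error, the only observable symptom is the stale default title, which is why this went unnoticed across several releases.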

Reproduction

  1. Set your model to anthropic/claude-opus-4-6 with variant max (or any variant that maps to output_config.effort)
  2. Start a new session and send a message
  3. Observe the session title remains "New session - <timestamp>"

Evidence from logs

INFO  service=llm providerID=anthropic modelID=claude-haiku-4-5-20251001
      sessionID=ses_xxx small=true agent=title mode=primary stream

ERROR service=llm
      requestBodyValues: {
        "model": "claude-haiku-4-5-20251001",
        "output_config": { "effort": "high" },   ← should not be here
        ...
      }
      responseBody: {
        "type": "error",
        "error": {
          "type": "invalid_request_error",
          "message": "This model does not support the effort parameter."
        }
      }

ERROR service=session.prompt
      error=No output generated. Check the stream for errors.
      failed to generate title

Impact data (from local DB)

Version   Sessions with title   Sessions without title   Success rate
1.3.9     0                     4                        0%
1.3.5     0                     3                        0%
1.3.3     0                     6                        0%
1.3.2     29                    2                        93.5%
1.3.0     40                    3                        93.0%
1.2.27    42                    2                        95.5%

Title generation drops to 0% success starting from v1.3.3 for users with an effort-bearing variant.

Root cause

In packages/opencode/src/session/prompt.ts, the ensureTitle function passes user: firstInfo to LLM.stream:

const result = await LLM.stream({
  agent: ag,
  user: firstInfo,   // ← carries variant from user's main model
  // ...
  model: mdl,        // ← this is the small model (haiku)
})

The user message's variant (e.g. max, which maps to effort: "high") is applied to the API call even though the resolved model (claude-haiku-4-5) doesn't support it.
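To make the leak concrete, here is a minimal sketch of the bug shape. The type names and the variant-to-parameter mapping are assumptions for illustration, not the real opencode internals:

```typescript
// Hypothetical request shape; the real one is the Anthropic request body.
type RequestBody = { model: string; output_config?: { effort: string } }

// Assumed mapping: the "max" variant expands to effort: "high".
const variantParams: Record<string, { effort: string }> = {
  max: { effort: "high" },
}

function buildRequest(model: string, variant?: string): RequestBody {
  const body: RequestBody = { model }
  // Bug shape: the variant is applied regardless of which model was resolved,
  // so the small title model inherits the main model's params.
  if (variant && variantParams[variant]) {
    body.output_config = variantParams[variant]
  }
  return body
}

console.log(buildRequest("claude-haiku-4-5-20251001", "max"))
// output_config.effort "leaks" into the small-model request body
```

The fix needs to break the link between the user's variant and the resolved model at exactly this point.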

Suggested fix

Strip or ignore the user variant when calling the title agent's small model. For example:

  • Pass user: { ...firstInfo, variant: undefined } in the title generation call
  • Or have LLM.stream skip unsupported parameters when small: true
  • Or catch the specific error and retry without the variant
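The first option could be a one-line change at the call site. A minimal sketch, assuming a UserMessage shape with an optional variant field (the real type lives in opencode's session code):

```typescript
// Hypothetical message shape for illustration.
type UserMessage = { content: string; variant?: string }

// Return a copy with the variant removed, so model-specific parameters from
// the user's main model never reach the small title model. The original
// message is left untouched for the primary agent.
function withoutVariant(user: UserMessage): UserMessage {
  return { ...user, variant: undefined }
}

const firstInfo: UserMessage = { content: "hello", variant: "max" }
console.log(withoutVariant(firstInfo).variant) // undefined
console.log(firstInfo.variant) // "max"
```

The second option (filtering in LLM.stream when small: true) would also protect any future small-model agents, at the cost of special-casing inside the stream helper.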

Workaround

Override the title agent's model in config to one that isn't affected:

In opencode.json, under the agent config:

"title": {
  "model": "google/gemini-3-flash"
}

Environment

  • opencode: v1.3.3 → v1.3.9 (all affected)
  • Provider: anthropic (confirmed), likely affects any provider where variant maps to unsupported model params
  • OS: macOS (arm64)

Metadata

Labels

core: Anything pertaining to core functionality of the application (opencode server stuff)
