fix: support Ollama 'reasoning' field alongside 'reasoning_content'#792

Open
Br1an67 wants to merge 1 commit into QwenLM:main from Br1an67:fix/ollama-reasoning-field
Conversation

@Br1an67 Br1an67 commented Mar 1, 2026

Problem

When using Qwen3 models via Ollama, thinking content arrives in a reasoning field on the streaming chunk delta. However, oai.py only checks for reasoning_content, so thinking content from Ollama is silently discarded and full_reasoning_content is never populated.

This also breaks the agentic loop when thought_in_content=True, because the postprocessor waits for </think> tags that never appear.

Fix

Use getattr with fallback to check both reasoning_content (DashScope/DeepSeek) and reasoning (Ollama) field names in all three code paths:

  • Delta stream mode
  • Non-delta stream mode
  • Non-stream mode
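The fallback described above can be sketched as follows. This is a minimal illustration, not the actual oai.py patch: the ChoiceDelta here is a stand-in dataclass rather than the real openai type, and extract_reasoning is a hypothetical helper name.

```python
# Minimal sketch of the field-name fallback (hypothetical helper,
# not the actual oai.py code). ChoiceDelta is a stand-in for the
# openai streaming delta type.
from dataclasses import dataclass
from typing import Optional


@dataclass
class ChoiceDelta:
    content: str = ''
    reasoning_content: Optional[str] = None  # DashScope/DeepSeek field
    reasoning: Optional[str] = None          # Ollama field


def extract_reasoning(delta: ChoiceDelta) -> Optional[str]:
    """Return thinking text from whichever field name is present."""
    return getattr(delta, 'reasoning_content', None) or getattr(delta, 'reasoning', None)


# DashScope/DeepSeek-style chunk
print(extract_reasoning(ChoiceDelta(reasoning_content='thinking...')))
# Ollama-style chunk
print(extract_reasoning(ChoiceDelta(reasoning='thinking...')))
```

The same one-line getattr fallback would be applied in each of the three code paths listed above.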

Ollama chunk format

ChoiceDelta(content='', reasoning='the model thinking here...')

vs DashScope/DeepSeek:

ChoiceDelta(content='', reasoning_content='the model thinking here...')

Fixes #789


Ollama returns thinking content in a 'reasoning' field on the streaming
chunk delta, while DashScope/DeepSeek-style APIs use 'reasoning_content'.
The current code only checks 'reasoning_content', causing thinking content
from Ollama to be silently discarded.

Use getattr with fallback to check both field names in all three code
paths (delta_stream, non-delta stream, and non-stream).

Fixes QwenLM#789


Development

Successfully merging this pull request may close these issues.

Ollama streaming chunks use reasoning field not reasoning_content — thinking content silently lost with Qwen3 models
