fix: handle empty OpenAI-compatible text responses #1218

Open
andyluo7 wants to merge 1 commit into EvolvingLMMs-Lab:main from andyluo7:openai-compatible-none-content-fix
Conversation


andyluo7 commented Mar 2, 2026

Summary

Harden the OpenAI-compatible evaluation path so None or structured message.content values do not break lmms-eval postprocessing.

Changes

  • normalize OpenAI-compatible message.content so None becomes an empty string and list-style structured content is flattened into plain text
  • guard unwrap_generation_output() against raw None and GenerationResult(text=None)
  • guard ConfigurableTask.process_results() so generate-until results are coerced to strings before .strip()
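The first change above can be sketched as a small normalization helper. This is a minimal sketch, not the PR's actual code; the function name normalize_content and the assumed block shape ({"type": "text", "text": "..."}) are illustrative assumptions:

```python
def normalize_content(content):
    """Coerce an OpenAI-compatible message.content value to plain text.

    Hypothetical helper illustrating the normalization described above:
    None becomes "", and list-style structured content blocks are
    flattened into a single string.
    """
    if content is None:
        return ""
    if isinstance(content, list):
        parts = []
        for block in content:
            if isinstance(block, dict):
                # Assumed block shape: {"type": "text", "text": "..."}
                parts.append(str(block.get("text", "")))
            else:
                parts.append(str(block))
        return "".join(parts)
    return str(content)
```

With this in place, downstream code can treat message.content uniformly as a string regardless of which backend produced it.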

Why

Some OpenAI-compatible backends can legitimately return message.content = None or structured content blocks. Previously, those values propagated into task postprocessing and caused crashes like:

AttributeError: 'NoneType' object has no attribute 'strip'
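The process_results() guard amounts to coercing each generate-until result to a string before stripping it. A minimal sketch, assuming a hypothetical coerce_result helper (not the PR's actual code):

```python
def coerce_result(result):
    # Coerce a generate-until result to a string before calling .strip(),
    # so a None from the backend no longer raises
    # AttributeError: 'NoneType' object has no attribute 'strip'.
    return ("" if result is None else str(result)).strip()
```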

Validation

  • verified the patched files parse cleanly
  • confirmed OCRBench evaluation completes successfully after this fix
