
fix(llm): surface contextual errors for Ollama chat failures#253

Open
Gujiassh wants to merge 1 commit into FujiwaraChoki:main from Gujiassh:fix/ollama-error-context

Conversation


@Gujiassh Gujiassh commented Apr 11, 2026

Summary

  • wrap `ollama.Client.chat` failures in `generate_text` with a descriptive `RuntimeError`
  • include the active Ollama base URL and model name in the error message
  • fail clearly when Ollama responses are missing `message.content`
  • add regression tests for the success path and both error paths
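The bullets above can be sketched roughly as follows. This is a hedged illustration, not the PR's actual diff: the `generate_text` name comes from the summary, but the `client` parameter, the `OllamaChatError` class, the `base_url`/`model` defaults, and the dict-shaped response are illustrative assumptions (newer `ollama` client versions return a typed response object rather than a plain dict).

```python
class OllamaChatError(RuntimeError):
    """Raised when an Ollama chat call fails or returns unusable content."""


def generate_text(client, prompt, *, model="llama3", base_url="http://localhost:11434"):
    """Call client.chat(...) and wrap failures with base-URL and model context.

    `client` is anything exposing a chat(model=..., messages=...) method in the
    style of ollama.Client; the exact signature here is an assumption.
    """
    try:
        response = client.chat(
            model=model,
            messages=[{"role": "user", "content": prompt}],
        )
    except Exception as exc:
        # Error path 1: the chat call itself failed (connection refused, etc.)
        raise OllamaChatError(
            f"Ollama chat failed at {base_url} with model {model!r}: {exc}"
        ) from exc

    # Assumes a dict-shaped response; a typed response object would use
    # attribute access (response.message.content) instead.
    content = (response.get("message") or {}).get("content")
    if not content:
        # Error path 2: a response arrived but message.content is missing/empty
        raise OllamaChatError(
            f"Ollama at {base_url} (model {model!r}) returned no message.content"
        )
    return content
```

Passing the client in (rather than constructing it inside the helper) is what makes the success path and both error paths easy to cover with stub clients in a regression test.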

Testing

  • pytest -q tests/test_llm_provider.py

Fixes #232.



Development

Successfully merging this pull request may close these issues.

fix(llm_provider): Ollama calls lack error handling
