
Add AsyncOllama to __all__ exports #1791

Merged
RobinPicard merged 1 commit into dottxt-ai:main from Anri-Lombard:add-async-exports on Dec 5, 2025


Conversation

@Anri-Lombard
Contributor

The AsyncOllama class was missing from the module's __all__ list. Spotted this while working on the lmstudio integration.

The AsyncOllama class was missing from the module's __all__ list,
making it inaccessible via 'from outlines.models.ollama import *'.
All other async model classes (AsyncVLLM, AsyncOpenAI, AsyncTGI,
AsyncMistral, AsyncSGLang) are properly exported.
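
For context, a minimal sketch of what the fix amounts to in `outlines/models/ollama.py`. The class bodies and the exact contents of the export list are assumptions for illustration, not the actual module source; only the addition of `"AsyncOllama"` to `__all__` reflects this PR.

```python
# Hypothetical, trimmed-down sketch of outlines/models/ollama.py.
# The real classes wrap the Ollama client; only the __all__ change
# mirrors what this PR does.


class Ollama:
    """Synchronous Ollama model wrapper (placeholder body)."""


class AsyncOllama:
    """Asynchronous Ollama model wrapper (placeholder body)."""


# Before the fix, "AsyncOllama" was absent from __all__, so
# `from outlines.models.ollama import *` never bound the name.
__all__ = ["Ollama", "AsyncOllama"]
```

With the name listed in `__all__`, a wildcard import picks up AsyncOllama the same way it already does for the other async model classes.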
@RobinPicard (Contributor) left a comment:


Thanks!

@RobinPicard merged commit f9c72dd into dottxt-ai:main on Dec 5, 2025
5 checks passed
