
langchain-ollama (partners) / langchain-core: allow passing ChatMessages to Ollama (including arbitrary roles)#30411

Merged
ccurme merged 14 commits into langchain-ai:master from rylativity:allow-arbitrary-roles-ollama
Apr 18, 2025

Conversation

@rylativity
Contributor

@rylativity rylativity commented Mar 21, 2025

Replacement for PR #30191 (@ccurme)

Description: currently, ChatOllama raises a ValueError if a ChatMessage is passed to it, as described in #30147 (comment).

Furthermore, ollama-python is removing its limitations on the roles that can be passed through chat messages to a model in Ollama - ollama/ollama-python#462 (comment).

This PR removes the role limitations imposed by LangChain and enables passing LangChain ChatMessages with arbitrary 'role' values through the ChatOllama class to the underlying ollama-python Client.
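To illustrate the pass-through described above, here is a minimal sketch (not the actual langchain-ollama code; the helper name `to_ollama_messages` is hypothetical) of converting LangChain-style message dicts, including ones with non-standard roles, into the payload shape that the ollama-python `Client.chat()` accepts:

```python
def to_ollama_messages(messages):
    # Pass each message's role through unchanged; before this PR, roles
    # outside the standard set raised a ValueError during conversion.
    payload = []
    for m in messages:
        payload.append({"role": m["role"], "content": m.get("content", "")})
    return payload

msgs = [
    {"role": "control", "content": "thinking"},
    {"role": "user", "content": "explain options trading"},
]
print(to_ollama_messages(msgs))
```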

As this PR relies on merged but unreleased functionality in ollama-python, I have temporarily pointed the ollama package source to the main branch of the ollama-python github repo.

Format, lint, and tests of new functionality are passing. Need to resolve an issue with recently added ChatOllama tests. (Now resolved.)

Issue: resolves #30122 (related to ollama issue ollama/ollama#8955)

Dependencies: no new dependencies

[x] PR title
[x] PR message
[x] Lint and test: format, lint, and test all running successfully and passing

@vercel

vercel bot commented Mar 21, 2025

The latest updates on your projects. Learn more about Vercel for Git ↗︎

Name       Status    Updated (UTC)
langchain  ✅ Ready  Apr 18, 2025 2:02pm

…or and enable future compatibility for strict mode
@rylativity rylativity marked this pull request as ready for review March 21, 2025 13:54
@dosubot dosubot bot added size:M labels Mar 21, 2025
@dosubot dosubot bot added size:L and removed size:M labels Mar 21, 2025
@rylativity
Contributor Author

@ccurme The change to libs/standard-tests/langchain_tests/unit_tests/chat_models.py in #30385 on 3/20 causes ChatOllama to fail two of the shared chat model tests (example test failure: https://github.com/langchain-ai/langchain/actions/runs/13985107639/job/39157371264). Even on the master branch, these tests fail for ChatOllama (can you confirm?).

I've updated ChatOllama's with_structured_output method to accept the strict param, but I don't believe Ollama actually does anything with that param currently (so I noted this in the docstring).
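A rough sketch of the signature change being described (simplified; not the real langchain-ollama implementation): the `strict` parameter is accepted for compatibility with shared code paths that pass it, but has no effect on the request sent to Ollama.

```python
def with_structured_output(schema, *, method="json_schema", strict=None):
    # strict is accepted so callers that pass strict= (as other providers
    # support) don't break, but it is currently a no-op for Ollama.
    return {"schema": schema, "method": method, "strict": strict}

config = with_structured_output({"type": "object"}, strict=True)
print(config)
```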

I have added the tests and example for using ChatMessages (optionally with arbitrary roles) with ChatOllama in the API reference.

As mentioned in the PR opening comment, this relies on functionality in ollama-python that is merged into main but not yet in a versioned release. We will need to revert the change to libs/partners/ollama/pyproject.toml before merging, but for now I've left it in for the CI/CD checks.

@edwinjosechittilappilly

Any updates on this PR?

…ils to create a ChatMessage for messages with unexpected message_types and log warning
@rylativity rylativity changed the title langchain-ollama (partners): allow passing ChatMessages to Ollama (including arbitrary roles) langchain-ollama (partners) / langchain-core: allow passing ChatMessages to Ollama (including arbitrary roles) Apr 17, 2025
@rylativity
Contributor Author

The required update to Ollama's Python client was released yesterday: https://github.com/ollama/ollama-python/releases/tag/v0.4.8

I believe this is good to go - the most recent CI failures above appear to be unrelated to these changes: they occur only intermittently (the same tests pass on some runs and fail on others), and the same intermittent failures occur on the master branch too.

@ccurme will you take a look at this and let me know if I need to make any additional changes, or if this is ready to merge?

"source": [
"## ChatOllama\n",
"\n",
"In some cases, you will need a model with a \"Chat\" interface. (for more info see - https://python.langchain.com/docs/concepts/chat_models/)."
Collaborator


We should update the page dedicated to ChatOllama (docs/integrations/chat/ollama.ipynb) instead of this.

Separately, this is a nitpick but we should basically be using chat models in all cases. Text completion interfaces are a legacy interface at this point (see the admonition at the top of the page).

msg = (
f"Unexpected message type: '{message_type}'. Use one of 'human',"
f" 'user', 'ai', 'assistant', 'function', 'tool', 'system', or 'developer'."
warning_msg = (
Collaborator


This function is unused in langchain-ollama, why are we updating it in core?

We can't make changes to core without tests, documented motivation, understanding impact to all LangChain integrations, etc.

Contributor Author


Also discussed here - #30191 (comment)

This function is used by ChatOllama if the messages passed to ChatOllama are dicts instead of ChatMessages (or other defined message classes).

For example, without updating how core handles converting dicts to defined message classes, this will not work because the value of the "role" field is not one of the standard roles accepted in core's _create_message_from_message_type util function...

from langchain_ollama import ChatOllama

llm = ChatOllama(
    base_url="http://localhost:11434",
    model="granite3.2",
    verbose=True,
    disable_streaming=True,
)

messages = [
    {"role": "control", "content": "thinking"},
    {"role": "user", "content": "explain options trading"},
]

print(llm.invoke(messages))
  File ".../temp_test/test_implementation.py", line 16, in <module>
    print(llm.invoke(messages))
  File ".../langchain/libs/core/langchain_core/language_models/chat_models.py", line 369, in invoke
    [self._convert_input(input)],
  File ".../langchain/libs/core/langchain_core/language_models/chat_models.py", line 349, in _convert_input
    return ChatPromptValue(messages=convert_to_messages(input))
  File ".../langchain/libs/core/langchain_core/messages/utils.py", line 363, in convert_to_messages
    return [_convert_to_message(m) for m in messages]
  File ".../langchain/libs/core/langchain_core/messages/utils.py", line 363, in <listcomp>
    return [_convert_to_message(m) for m in messages]
  File ".../langchain/libs/core/langchain_core/messages/utils.py", line 336, in _convert_to_message
    _message = _create_message_from_message_type(
  File ".../langchain/libs/core/langchain_core/messages/utils.py", line 288, in _create_message_from_message_type
    raise ValueError(msg)
ValueError: Unexpected message type: 'control'. Use one of 'human', 'user', 'ai', 'assistant', 'function', 'tool', 'system', or 'developer'.
For troubleshooting, visit: https://python.langchain.com/docs/troubleshooting/errors/MESSAGE_COERCION_FAILURE 

But this will work:

from langchain_ollama import ChatOllama
from langchain_core.messages import ChatMessage

llm = ChatOllama(
    base_url="http://localhost:11434",
    model="granite3.2",
    verbose=True,
    disable_streaming=True,
)

messages = [
    {"role": "control", "type": "chat", "content": "thinking"},
    {"role": "user", "content": "explain options trading"},
]
messages = [ChatMessage(**m) for m in messages]
print(llm.invoke(messages))
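The core-side change being discussed can be sketched roughly as follows (reconstructed from the discussion and the commit message above; this is not the actual langchain-core code, and the function name and dict shapes are simplified stand-ins): instead of raising a ValueError for an unknown message type, fall back to a generic chat-style message that carries the unknown value as its role, and emit a warning.

```python
import warnings

_KNOWN_TYPES = {
    "human", "user", "ai", "assistant",
    "function", "tool", "system", "developer",
}

def create_message_from_type(message_type, content):
    # Known types behave as before.
    if message_type in _KNOWN_TYPES:
        return {"type": message_type, "content": content}
    # Unknown types: warn and fall back to a ChatMessage-style dict
    # instead of raising, so roles like 'control' pass through.
    warnings.warn(
        f"Unexpected message type: '{message_type}'. "
        "Falling back to a ChatMessage with that role."
    )
    return {"type": "chat", "role": message_type, "content": content}

print(create_message_from_type("control", "thinking"))
```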

Comment thread on libs/partners/ollama/langchain_ollama/chat_models.py (outdated)
Collaborator

@ccurme left a comment


Thanks for this!

I made some changes before merging, please let me know if anything looks wrong and we can take another look. Otherwise we can release.

@dosubot dosubot bot added the lgtm label Apr 18, 2025
@ccurme merged commit dbf9986 into langchain-ai:master Apr 18, 2025
22 checks passed
@rylativity
Contributor Author

For the reasons discussed in my response comment above (#30411 (comment)), I don't think this will resolve #30122 without the changes in core, unless clients/callers are required to explicitly pass ChatMessage objects (rather than dicts) when using custom or non-standard roles.



Development

Successfully merging this pull request may close these issues.

Granite 3.2 Thinking
