langchain-ollama (partners) / langchain-core: allow passing ChatMessages to Ollama (including arbitrary roles)#30411
Conversation
…or and enable future compatibility for strict mode
|
ccurme (@ccurme) The change to libs/standard-tests/langchain_tests/unit_tests/chat_models.py in #30385 on 3/20 causes ChatOllama to fail two of the shared chat model tests (example test failure - https://github.com/langchain-ai/langchain/actions/runs/13985107639/job/39157371264). Even on the master branch, these tests fail for ChatOllama (can you confirm?). I've updated ChatOllama's tests and added an example for using ChatMessages (optionally with arbitrary roles) with ChatOllama in the API reference. As mentioned in the PR opening comment, this relies on functionality in ollama-python that is merged into main but not yet in a versioned release. We will need to revert the change to libs/partners/ollama/pyproject.toml before merging, but for now I've left it in for the CI/CD checks. |
|
Any updates on this PR? |
…ils to create a ChatMessage for messages with unexpected message_types and log warning
|
The required update in ollama's python client was released yesterday - https://github.com/ollama/ollama-python/releases/tag/v0.4.8. I believe this is good to go - the most recent CI failures above appear to be unrelated to these changes: they occur only intermittently (running the tests multiple times, they sometimes pass and sometimes fail), and the same intermittent failures appear on the master branch too. ccurme (@ccurme) will you take a look at this and let me know if I need to make any additional changes or if this is ready to merge? |
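Once the temporary git source is reverted, the dependency in libs/partners/ollama/pyproject.toml would look roughly like this (the exact bounds and table layout are assumptions, depending on the build backend in use):

```toml
[project]
dependencies = [
    # require the release that lifted the role restrictions
    "ollama>=0.4.8,<1",
]
```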
| "source": [ | ||
| "## ChatOllama\n", | ||
| "\n", | ||
| "In some cases, you will need a model with a \"Chat\" interface. (for more info see - https://python.langchain.com/docs/concepts/chat_models/)." |
We should update the page dedicated to ChatOllama (docs/integrations/chat/ollama.ipynb) instead of this.
Separately, this is a nitpick but we should basically be using chat models in all cases. Text completion interfaces are a legacy interface at this point (see the admonition at the top of the page).
| msg = (
|     f"Unexpected message type: '{message_type}'. Use one of 'human',"
|     f" 'user', 'ai', 'assistant', 'function', 'tool', 'system', or 'developer'."
| warning_msg = (
This function is unused in langchain-ollama, why are we updating it in core?
We can't make changes to core without tests, documented motivation, understanding impact to all LangChain integrations, etc.
Also discussed here - #30191 (comment)
This function is used by ChatOllama when the messages passed to it are dicts instead of ChatMessages (or other defined message classes).
For example, without updating how core converts dicts to the defined message classes, the following will not work, because the value of the "role" field is not one of the standard roles accepted by core's _create_message_from_message_type util function:
from langchain_ollama import ChatOllama
from langchain_core.messages import ChatMessage

llm = ChatOllama(
    base_url="http://localhost:11434",
    model="granite3.2",
    verbose=True,
    disable_streaming=True,
)
messages = [
    {"role": "control", "content": "thinking"},
    {"role": "user", "content": "explain options trading"},
]
print(llm.invoke(messages))
Traceback (most recent call last):
File ".../temp_test/test_implementation.py", line 16, in <module>
print(llm.invoke(messages))
File ".../langchain/libs/core/langchain_core/language_models/chat_models.py", line 369, in invoke
[self._convert_input(input)],
File ".../langchain/libs/core/langchain_core/language_models/chat_models.py", line 349, in _convert_input
return ChatPromptValue(messages=convert_to_messages(input))
File ".../langchain/libs/core/langchain_core/messages/utils.py", line 363, in convert_to_messages
return [_convert_to_message(m) for m in messages]
File ".../langchain/libs/core/langchain_core/messages/utils.py", line 363, in <listcomp>
return [_convert_to_message(m) for m in messages]
File ".../langchain/libs/core/langchain_core/messages/utils.py", line 336, in _convert_to_message
_message = _create_message_from_message_type(
File ".../langchain/libs/core/langchain_core/messages/utils.py", line 288, in _create_message_from_message_type
raise ValueError(msg)
ValueError: Unexpected message type: 'control'. Use one of 'human', 'user', 'ai', 'assistant', 'function', 'tool', 'system', or 'developer'.
For troubleshooting, visit: https://python.langchain.com/docs/troubleshooting/errors/MESSAGE_COERCION_FAILURE
But this will work:
from langchain_ollama import ChatOllama
from langchain_core.messages import ChatMessage

llm = ChatOllama(
    base_url="http://localhost:11434",
    model="granite3.2",
    verbose=True,
    disable_streaming=True,
)
messages = [
    {"role": "control", "type": "chat", "content": "thinking"},
    {"role": "user", "content": "explain options trading"},
]
messages = [ChatMessage(**m) for m in messages]
print(llm.invoke(messages))
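The core-side fallback this PR proposes (create a ChatMessage and log a warning for unexpected message types, rather than raising) can be sketched in isolation. The dataclasses below are simplified stand-ins for the real langchain_core.messages classes, and the function is a sketch of the idea, not the actual implementation:

```python
import warnings
from dataclasses import dataclass


# Simplified stand-ins for langchain_core.messages classes.
@dataclass
class HumanMessage:
    content: str


@dataclass
class AIMessage:
    content: str


@dataclass
class SystemMessage:
    content: str


@dataclass
class ChatMessage:
    role: str
    content: str


_KNOWN_ROLES = {
    "human": HumanMessage,
    "user": HumanMessage,
    "ai": AIMessage,
    "assistant": AIMessage,
    "system": SystemMessage,
}


def create_message_from_message_type(message_type: str, content: str):
    """Map a role string to a message class; for unexpected roles,
    warn and fall back to a generic ChatMessage instead of raising,
    so arbitrary roles (e.g. Granite's 'control') survive conversion."""
    cls = _KNOWN_ROLES.get(message_type)
    if cls is not None:
        return cls(content=content)
    warnings.warn(
        f"Unexpected message type: '{message_type}'. "
        "Falling back to ChatMessage with this role."
    )
    return ChatMessage(role=message_type, content=content)
```

With this fallback, the first (dict-only) example above would coerce {"role": "control", ...} into a ChatMessage instead of raising MESSAGE_COERCION_FAILURE.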
ccurme (ccurme)
left a comment
Thanks for this!
I made some changes before merging, please let me know if anything looks wrong and we can take another look. Otherwise we can release.
|
For the reasons discussed in my response comment above - #30411 (comment) - I don't think this will resolve #30122 without the changes in core, unless clients/callers are required to explicitly pass ChatMessage objects (as opposed to dicts) when using custom or non-standard roles. |
Replacement for PR #30191 (ccurme (@ccurme))
Description: currently, ChatOllama raises a ValueError if a ChatMessage is passed to it, as described in #30147 (comment).
Furthermore, ollama-python is removing the limitations on valid roles that can be passed through chat messages to a model in ollama - ollama/ollama-python#462 (comment).
This PR removes the role limitations imposed by langchain and enables passing langchain ChatMessages with arbitrary 'role' values through the langchain ChatOllama class to the underlying ollama-python Client.
As this PR relies on merged but unreleased functionality in ollama-python, I have temporarily pointed the ollama package source to the main branch of the ollama-python github repo.
Format, lint, and tests of new functionality passing. Need to resolve issue with recently added ChatOllama tests. (Now resolved)
Issue: resolves #30122 (related to ollama issue ollama/ollama#8955)
Dependencies: no new dependencies
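The langchain-ollama side of the change boils down to forwarding a ChatMessage's role verbatim to the underlying client instead of rejecting non-standard values. A minimal, self-contained sketch (the dataclasses are stand-ins for langchain_core.messages, and the function name and role mapping are hypothetical, not the actual ChatOllama internals):

```python
from dataclasses import dataclass


# Simplified stand-ins for langchain_core.messages classes.
@dataclass
class HumanMessage:
    content: str


@dataclass
class ChatMessage:
    content: str
    role: str


def to_ollama_dicts(messages):
    """Convert message objects to the {'role', 'content'} dicts that
    ollama-python's Client.chat() accepts. A ChatMessage's role is
    passed through untouched, so e.g. Granite's 'control' role works."""
    role_by_class = {"HumanMessage": "user"}  # assumed partial mapping
    converted = []
    for m in messages:
        # Prefer an explicit role attribute; otherwise fall back to the
        # role implied by the message class.
        role = getattr(m, "role", None) or role_by_class[type(m).__name__]
        converted.append({"role": role, "content": m.content})
    return converted
```

The point of the sketch is the absence of any role allow-list: whatever role the ChatMessage carries is what the Ollama client receives.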
[x] PR title
[x] PR message
[x] Lint and test: format, lint, and test all running successfully and passing