This repository was archived by the owner on Jun 5, 2025. It is now read-only.

Commit 328c048

Pass along the raw parameter

The `raw` parameter tells the LLM to never use natural language and to reply only in the format of the message. We need to pass it along to the generate call, or else we might get garbage back to the client.
1 parent 1100547 · commit 328c048

2 files changed: +2 −0

src/codegate/providers/ollama/completion_handler.py (1 addition, 0 deletions)

```diff
@@ -100,6 +100,7 @@ async def execute_completion(
         response = await self.client.generate(
             model=request["model"],
             prompt=prompt,
+            raw=request.get("raw", False),
             suffix=request.get("suffix", ""),
             stream=stream,
             options=request["options"],  # type: ignore
```
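The fix forwards `raw` with a `False` default via `request.get`, so clients that never set the flag keep the old behavior. A minimal sketch of that default-forwarding logic (the `build_generate_kwargs` helper is hypothetical, for illustration only, not CodeGate's actual code):

```python
def build_generate_kwargs(request: dict) -> dict:
    """Mirror how execute_completion forwards request fields to client.generate.

    `raw` falls back to False when the client did not set it, so the model
    only skips its natural-language wrapping when raw mode was explicitly
    requested.
    """
    return {
        "model": request["model"],
        "prompt": request.get("prompt", ""),
        "raw": request.get("raw", False),
        "suffix": request.get("suffix", ""),
    }

# A FIM-style request that sets raw=True keeps the model in raw mode...
fim = build_generate_kwargs({"model": "codellama", "raw": True})
# ...while an ordinary request falls back to the False default.
chat = build_generate_kwargs({"model": "codellama"})
```

Without the `raw=request.get("raw", False)` line, a client that asked for raw completions would silently get natural-language-wrapped output back, which is the garbage the commit message refers to.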

tests/providers/ollama/test_ollama_completion_handler.py (1 addition, 0 deletions)

```diff
@@ -41,6 +41,7 @@ async def test_execute_completion_is_fim_request(handler, chat_request):
         stream=False,
         options=chat_request["options"],
         suffix="",
+        raw=False,
     )
```

0 commit comments