This repository was archived by the owner on Jun 5, 2025. It is now read-only.

Add response format parameter to LLM chat completion call #234

Merged
ptelang merged 1 commit into main from add-response-format on Dec 9, 2024

Conversation

ptelang (Contributor) commented on Dec 9, 2024

Setting the response format parameter ensures that the LLM response is JSON.
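The PR diff is not shown here, but the description matches the OpenAI-style `response_format={"type": "json_object"}` parameter on chat completion calls. A minimal sketch of what such a change typically looks like, with the function and model names being illustrative assumptions rather than this repository's actual code:

```python
import json


def build_chat_completion_request(model, messages, response_format=None):
    """Assemble keyword arguments for an OpenAI-style chat completion call.

    Passing response_format={"type": "json_object"} asks the model to emit
    valid JSON instead of free-form text; omitting it keeps the old behavior.
    (Hypothetical helper for illustration; not from this PR's diff.)
    """
    request = {"model": model, "messages": messages}
    if response_format is not None:
        request["response_format"] = response_format
    return request


# Usage: request JSON output from the model.
request = build_chat_completion_request(
    model="gpt-4o-mini",  # assumed model name, not specified in the PR
    messages=[{"role": "user", "content": "List three colors as JSON."}],
    response_format={"type": "json_object"},
)
print(json.dumps(request["response_format"]))
```

Note that with the real OpenAI API, JSON mode also requires the word "JSON" to appear somewhere in the messages, so prompts usually mention it explicitly as above.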

@ptelang ptelang merged commit d666bb6 into main Dec 9, 2024

2 participants