Fix xgrammar fallback for v0 #2155
Merged
wnetelhabana merged 2 commits into HabanaAI:v1.22.2_next on Nov 28, 2025
Conversation
Signed-off-by: 12010486 <silvia.colabrese@intel.com>
mgawarkiewicz-intel
approved these changes
Nov 27, 2025
michalkuligowski
approved these changes
Nov 27, 2025
/run-gaudi-tests
PatrykWo
pushed a commit
that referenced
this pull request
Dec 4, 2025
Fix xgrammar fallback for v0 (#2155)
wnetelhabana
pushed a commit
that referenced
this pull request
Dec 5, 2025
Fix xgrammar fallback for v0 (#2155)
Issue:
When using tool calling on V0, xgrammar falls back to outlines, and we are unable to handle complex tool-calling requests (used for agentic AI) that are handled correctly on Nvidia.
It is not a failure; we get:
```
WARNING 11-26 20:25:33 [__init__.py:34] xgrammar does not support advanced JSON schema features like string length, item limits, or property bounds. Falling back to use outlines instead.
```
but the service is not usable with the wrong parser.
I've adapted this solution:
https://github.com/vllm-project/vllm/blob/v0.9.0.1/vllm/v1/structured_output/backend_xgrammar.py#L198
to the `has_xgrammar_unsupported_json_features()` function used in V0.
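For context, the general shape of such a check is a recursive scan of the JSON schema for the constraint keywords the warning mentions (string length, item limits, property bounds). The following is a minimal illustrative sketch of that approach, not the merged code; the exact keyword list and structure in the PR follow the linked vLLM v1 backend instead.

```python
# Illustrative sketch (not the merged code): recursively scan a JSON schema
# for constraint keywords that xgrammar cannot compile, so the caller can
# decide whether to fall back to outlines. The keyword list here is an
# assumption based on the warning text, not the exact list used in the PR.
UNSUPPORTED_KEYS = (
    "minLength", "maxLength",        # string length constraints
    "minItems", "maxItems",          # array item limits
    "minProperties", "maxProperties" # object property bounds
)

def has_xgrammar_unsupported_json_features(schema) -> bool:
    """Return True if the schema uses features xgrammar does not support."""
    if isinstance(schema, dict):
        # Flag any unsupported keyword at this level.
        if any(key in schema for key in UNSUPPORTED_KEYS):
            return True
        # Recurse into nested schemas (properties, items, anyOf lists, ...).
        return any(has_xgrammar_unsupported_json_features(v)
                   for v in schema.values())
    if isinstance(schema, list):
        return any(has_xgrammar_unsupported_json_features(v) for v in schema)
    return False
```

A schema such as `{"type": "string", "maxLength": 5}` would be flagged, while a plain `{"type": "object", "properties": {"a": {"type": "string"}}}` would not, so simple tool-calling schemas stay on xgrammar instead of being forced onto outlines.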