AndreiCautisanu approved these changes on Apr 9, 2026
Details
Adds two automated behaviors to the daily provider model sync workflow:
- `sync_provider_models.py` now also regenerates `llm-models-default.yaml` (the runtime registry consumed by `LlmModelRegistryService`) alongside the existing Java enum and TypeScript files. This ensures the CDN-hosted YAML stays in sync with every model update. Reasoning flags are carried over from the existing file; all other properties (`structuredOutput`, `qualifiedName` for VertexAI) are derived from the same sources as the Java enums.
- The generated file is uploaded to `s3://cdn.comet.ml/opik/llm-models-default.yaml` and the CloudFront distribution is invalidated so consumers pick up the new file within the configured TTL.

Change checklist
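As a rough illustration of the regeneration step described above — merging reasoning flags from the existing YAML into freshly derived model data, and emitting `[]` for empty provider sections — a sketch might look like the following. Only the function name `regenerate_llm_models_yaml` comes from this PR; the dict shapes and the `to_yaml` helper are assumptions for illustration, not the actual implementation.

```python
# Illustrative sketch only -- dict shapes and to_yaml are hypothetical,
# not the actual Opik implementation.

def regenerate_llm_models_yaml(derived, existing):
    """Merge reasoning flags from the existing YAML into derived model data.

    derived:  provider -> list of model dicts built from the same sources
              as the Java enums (structuredOutput, qualifiedName, ...).
    existing: provider -> {model name -> model dict} parsed from the
              current llm-models-default.yaml.
    """
    registry = {}
    for provider, models in derived.items():
        merged = []
        for model in models:
            entry = dict(model)
            old = existing.get(provider, {}).get(model["name"], {})
            if "reasoning" in old:  # carry the flag over from the old file
                entry["reasoning"] = old["reasoning"]
            merged.append(entry)
        registry[provider] = merged  # empty providers stay a list, not None
    return registry


def to_yaml(registry):
    """Minimal emitter: empty sections render as 'provider: []' so that
    Jackson deserializes an empty list instead of null."""
    lines = []
    for provider, models in registry.items():
        if not models:
            lines.append(f"{provider}: []")
            continue
        lines.append(f"{provider}:")
        for model in models:
            lines.append(f"  - name: {model['name']}")
            for key, value in model.items():
                if key == "name":
                    continue
                rendered = str(value).lower() if isinstance(value, bool) else value
                lines.append(f"    {key}: {rendered}")
    return "\n".join(lines) + "\n"
```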
Issues
- `regenerate_llm_models_yaml()` function in sync script + workflow S3/CloudFront steps
- `llm-models-default.yaml` schema; verify AWS credential/role/bucket/distribution variable names match infra setup

Testing
- `python scripts/sync_provider_models.py --dry-run` — passes, no errors
- `python -c "from scripts.sync_provider_models import regenerate_llm_models_yaml ..."` — verified correct YAML output: `structuredOutput`, `reasoning`, and VertexAI `qualifiedName` fields all render correctly; empty provider sections emit `[]` (valid YAML, avoids null deserialization in Jackson)

Documentation
N/A
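For context, the workflow's publish steps (upload plus CloudFront invalidation) might resemble the GitHub Actions fragment below. This is a hedged sketch: only the bucket path comes from this PR; the step names, the `CLOUDFRONT_DISTRIBUTION_ID` secret, and the exact aws-cli flags are assumptions about the infra setup.

```yaml
# Hypothetical workflow fragment -- names and secrets are assumptions.
- name: Upload llm-models-default.yaml to S3
  run: aws s3 cp llm-models-default.yaml s3://cdn.comet.ml/opik/llm-models-default.yaml

- name: Invalidate CloudFront cache
  run: |
    aws cloudfront create-invalidation \
      --distribution-id "${{ secrets.CLOUDFRONT_DISTRIBUTION_ID }}" \
      --paths "/opik/llm-models-default.yaml"
```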