fix: pass both 'openai' and 'azure' providerOptions keys for @ai-sdk/azure#20272
meruiden wants to merge 1 commit into anomalyco:dev
Conversation
Thanks for your contribution! This PR doesn't have a linked issue. All PRs must reference an existing issue. Please:
See CONTRIBUTING.md for details.
Thanks for updating your PR! It now meets our contributing guidelines. 👍
`@ai-sdk/azure` delegates to `OpenAIChatLanguageModel` from `@ai-sdk/openai`, which hardcodes `provider: 'openai'` when calling `parseProviderOptions`, so it only reads model options from `providerOptions["openai"]`. Meanwhile, `OpenAIResponsesLanguageModel` checks `providerOptions["azure"]` first, falling back to `"openai"`. Previously, `providerOptions()` only passed options under the `"azure"` key (via `sdkKey`), which meant model options like `reasoningEffort` were silently ignored on the chat completions path.

Fix: for `@ai-sdk/azure`, pass options under both `"openai"` and `"azure"` keys so they are picked up by both the Chat and Responses model implementations. This avoids changing `sdkKey()`, which is also used for message-level providerOptions remapping where `"azure"` is the correct key.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
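The dual-key approach from the commit message can be sketched as below. This is a minimal illustration, not the actual `transform.ts` code; the function name `buildProviderOptions` and its signature are hypothetical:

```typescript
type ProviderOptions = Record<string, Record<string, unknown>>

// Hypothetical sketch of the fix: for @ai-sdk/azure, duplicate the model
// options under both "openai" and "azure" keys, since OpenAIChatLanguageModel
// hardcodes provider "openai" while OpenAIResponsesLanguageModel checks
// "azure" first and then falls back to "openai".
function buildProviderOptions(
  npm: string, // provider package name, e.g. "@ai-sdk/azure"
  sdkKey: string, // key the provider normally reads, e.g. "azure"
  options: Record<string, unknown>, // model options, e.g. { reasoningEffort: "medium" }
): ProviderOptions {
  if (npm === "@ai-sdk/azure") {
    return { openai: options, azure: options }
  }
  return { [sdkKey]: options }
}

const result = buildProviderOptions("@ai-sdk/azure", "azure", {
  reasoningEffort: "medium",
})
// Both result.openai.reasoningEffort and result.azure.reasoningEffort
// are "medium", so either model implementation finds the option.
```

Duplicating the options at the model level (rather than changing `sdkKey()`) keeps message-level providerOptions remapping, where `"azure"` is the correct key, untouched.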
Force-pushed from 5e6a477 to ae2ec36
@rekram1-node Updated the approach based on your feedback on #20275. The original fix changed `sdkKey()`. New approach: keep `sdkKey()` unchanged and pass the model options under both keys instead.

Rebased on latest `dev`.
Issue for this PR
Closes #20275
Type of change
What does this PR do?
When `providerOptions()` in `transform.ts` builds model-level options for `@ai-sdk/azure`, it wraps them under the `"azure"` key (via `sdkKey()`). However, `@ai-sdk/azure` delegates to `OpenAIChatLanguageModel` from `@ai-sdk/openai`, which hardcodes `provider: 'openai'` in `parseProviderOptions`, so it only reads from `providerOptions["openai"]`. Model options like `reasoningEffort` end up under the wrong key and are silently ignored.

`OpenAIResponsesLanguageModel` is smarter: it checks `"azure"` first, then falls back to `"openai"`. So the responses path wasn't affected, only the chat completions path.

Fix: for `@ai-sdk/azure`, pass model options under both `"openai"` and `"azure"` keys. This ensures both the Chat and Responses model implementations pick them up.

This approach avoids changing `sdkKey()` itself, which is also used for message-level `providerOptions` remapping (line ~294) where `"azure"` is the correct key, as addressed in #20326.

How did you verify your code works?
- Set `"options": { "reasoningEffort": "medium" }` on a model
- Confirmed `"reasoning_effort": "medium"` is now present in the request body
- Confirmed on `dev` (with "fix: rm exclusion of ai-sdk/azure in transform.ts, when we migrated to v6 the ai sdk changed the key for ai-sdk/azure so the exclusion is no longer needed" #20326 merged) that the issue still reproduces without this fix
- (The `externalOutputMode` error in `tui/app.tsx` is pre-existing on `dev`)

Checklist