Releases: BerriAI/litellm
v1.83.14-stable.patch.3
Verify Docker Image Signature
All LiteLLM Docker images are signed with cosign. Every release is signed with the same key introduced in commit 0112e53.
Verify using the pinned commit hash (recommended):
A commit hash is cryptographically immutable, so this is the strongest way to ensure you are using the original signing key:
cosign verify \
--key https://raw.githubusercontent.com/BerriAI/litellm/0112e53046018d726492c814b3644b7d376029d0/cosign.pub \
ghcr.io/berriai/litellm:v1.83.14-stable.patch.3
Verify using the release tag (convenience):
Tags are protected in this repository and resolve to the same key. This option is easier to read but relies on tag protection rules:
cosign verify \
--key https://raw.githubusercontent.com/BerriAI/litellm/v1.83.14-stable.patch.3/cosign.pub \
ghcr.io/berriai/litellm:v1.83.14-stable.patch.3
Expected output:
The following checks were performed on each of these signatures:
- The cosign claims were validated
- The signatures were verified against the specified public key
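For scripting, the two values that change between releases (the tag and, in the convenience form, the key URL) can be parameterized. The following is a minimal sketch, not an official helper: it only assembles the pinned-commit verification command for a tag you choose (the `TAG` value shown is just an example), so you can review or run the printed command wherever `cosign` is installed.

```shell
# Sketch: assemble the pinned-commit cosign verification command for a release tag.
# Pure string assembly; nothing is fetched or verified here.
TAG="v1.83.14-stable.patch.3"
KEY_URL="https://raw.githubusercontent.com/BerriAI/litellm/0112e53046018d726492c814b3644b7d376029d0/cosign.pub"
IMAGE="ghcr.io/berriai/litellm:${TAG}"
echo "cosign verify --key ${KEY_URL} ${IMAGE}"
```

Piping the printed command to `sh` (or running it directly) performs the actual verification; a non-zero exit status means the signature did not verify against the pinned key.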
Full Changelog: v1.83.14-stable.patch.2...v1.83.14-stable.patch.3
v1.83.14-stable.patch.2
Verify Docker Image Signature
All LiteLLM Docker images are signed with cosign. Every release is signed with the same key introduced in commit 0112e53.
Verify using the pinned commit hash (recommended):
A commit hash is cryptographically immutable, so this is the strongest way to ensure you are using the original signing key:
cosign verify \
--key https://raw.githubusercontent.com/BerriAI/litellm/0112e53046018d726492c814b3644b7d376029d0/cosign.pub \
ghcr.io/berriai/litellm:v1.83.14-stable.patch.2
Verify using the release tag (convenience):
Tags are protected in this repository and resolve to the same key. This option is easier to read but relies on tag protection rules:
cosign verify \
--key https://raw.githubusercontent.com/BerriAI/litellm/v1.83.14-stable.patch.2/cosign.pub \
ghcr.io/berriai/litellm:v1.83.14-stable.patch.2
Expected output:
The following checks were performed on each of these signatures:
- The cosign claims were validated
- The signatures were verified against the specified public key
Full Changelog: v1.83.14-stable.patch.1...v1.83.14-stable.patch.2
v1.83.10-stable.patch.1
Verify Docker Image Signature
All LiteLLM Docker images are signed with cosign. Every release is signed with the same key introduced in commit 0112e53.
Verify using the pinned commit hash (recommended):
A commit hash is cryptographically immutable, so this is the strongest way to ensure you are using the original signing key:
cosign verify \
--key https://raw.githubusercontent.com/BerriAI/litellm/0112e53046018d726492c814b3644b7d376029d0/cosign.pub \
ghcr.io/berriai/litellm:v1.83.10-stable.patch.1
Verify using the release tag (convenience):
Tags are protected in this repository and resolve to the same key. This option is easier to read but relies on tag protection rules:
cosign verify \
--key https://raw.githubusercontent.com/BerriAI/litellm/v1.83.10-stable.patch.1/cosign.pub \
ghcr.io/berriai/litellm:v1.83.10-stable.patch.1
Expected output:
The following checks were performed on each of these signatures:
- The cosign claims were validated
- The signatures were verified against the specified public key
Full Changelog: v1.83.10-stable...v1.83.10-stable.patch.1
v1.84.0-rc.1
Verify Docker Image Signature
All LiteLLM Docker images are signed with cosign. Every release is signed with the same key introduced in commit 0112e53.
Verify using the pinned commit hash (recommended):
A commit hash is cryptographically immutable, so this is the strongest way to ensure you are using the original signing key:
cosign verify \
--key https://raw.githubusercontent.com/BerriAI/litellm/0112e53046018d726492c814b3644b7d376029d0/cosign.pub \
ghcr.io/berriai/litellm:v1.84.0-rc.1
Verify using the release tag (convenience):
Tags are protected in this repository and resolve to the same key. This option is easier to read but relies on tag protection rules:
cosign verify \
--key https://raw.githubusercontent.com/BerriAI/litellm/v1.84.0-rc.1/cosign.pub \
ghcr.io/berriai/litellm:v1.84.0-rc.1
Expected output:
The following checks were performed on each of these signatures:
- The cosign claims were validated
- The signatures were verified against the specified public key
What's Changed
- merge main by @Sameerlite in #25521
- merge litellm_internal_staging by @Sameerlite in #25949
- merge main by @Sameerlite in #26304
- merge main by @Sameerlite in #26379
- merge main by @Sameerlite in #26437
- fix(redis): cache GCP IAM token to prevent async event loop blocking by @harish-berri in #26441
- litellm oss branch by @krrish-berri-2 in #26386
- fix noma v2 deepcopy crashing in build scan payload - new PR by @omriShukrun08 in #26605
- fix(ui): use stored-credentials endpoint for tools fetch on MCP edit page by @ryan-crabbe-berri in #26002
- feat(proxy): add --timeout_worker_healthcheck flag for uvicorn worker triage by @ryan-crabbe-berri in #26622
- fix(ci): support CircleCI rerun failed tests for local_testing jobs by @mateo-berri in #26461
- docs: update pull_request_template to add Linear ticket mentioning by @mateo-berri in #26655
- fix(pricing): GPT-5.5 Pro Pricing by @lmcdonald-godaddy in #26651
- feat(proxy): Add cleanup job for expired LiteLLM dashboard session keys by @milan-berri in #26460
- fix(ui): move 'Store Prompts in Spend Logs' toggle to Admin Settings by @ryan-crabbe-berri in #26631
- fix(caching): preserve prompt_tokens_details through embedding cache round-trip by @michelligabriele in #26653
- feat(logging): add retry settings for generic API logger by @milan-berri in #26645
- fix(logging): backfill streaming hidden response cost by @milan-berri in #26606
- fix(vertex-ai): reuse anthropic messages config instances by @Sameerlite in #26099
- fix(vertex): preserve items on array branches in anyOf with null + de-flake test by @yuneng-berri in #26675
- fix(tests): replace deprecated Bedrock Claude 3.7 Sonnet model ID by @ryan-crabbe-berri in #26721
- [Fix] Cache LiteLLM_Config param reads in DualCache and batch by @Michael-RZ-Berri in #26469
- [Feat] Lazy-load optional feature routers on first request by @Michael-RZ-Berri in #26534
- [Fix] Unify cost calc in success_handler dict and typed branches by @Michael-RZ-Berri in #26629
- Revert "[Feat] Lazy-load optional feature routers on first request" by @krrish-berri-2 in #26727
- [Infra] Version Bump by @yuneng-berri in #26728
- [Infra] Promote Internal Staging to main by @yuneng-berri in #26731
- ci(release): accept PEP 440 tag forms in create-release workflow by @yuneng-berri in #26734
- [Feat] Add gpt-image-2 support (#26644) by @ishaan-berri in #26705
- feat(provider): add AIHubMix as an OpenAI-compatible provider by @xinrui-z in #24294
- merge internal staging by @Sameerlite in #26737
- merge main by @Sameerlite in #26742
- merge mian by @Sameerlite in #26745
- fix(test): scope ERROR log assertion to LiteLLM logger in test_model_alias_map by @mateo-berri in #26741
- merge main by @Sameerlite in #26757
- fix(bedrock, anthropic): translate OpenAI file content on tool-result path by @minznerjosh in #26710
- remove /ui/chat page by @ishaan-berri in #26739
- fix: add optional TCP SO_KEEPALIVE support to aiohttp's TCPConnector by @yassinkortam in #26730
- feat(proxy): LiteLLM headers on Google native generateContent routes by @Sameerlite in #25500
- feat(vector-stores): support Bedrock retrievalConfiguration passthrough by @Sameerlite in #26685
- feat(mcp): opt-in short-ID tool prefix to keep MCP tool names under the 60-char limit by @mateo-berri in #26733
- fix(proxy): self-heal Prisma read paths + harden reconnect state machine by @yuneng-berri in #26756
- [Fix] Redact spend logs error message by @Michael-RZ-Berri in #26662
- [Feat]Add support for azure entra discovery endpoint by @Sameerlite in #26584
- [Fix] Proxy: reconnect Prisma DB without blocking the event loop by @yuneng-berri in #26225
- feat(proxy): durable agent workflow run tracking via /v1/workflows/runs by @ishaan-berri in #26793
- chore(auth): tighten clientside api_base handling by @stuxf in #26518
- fix: drop sensitive locals from re-raised error messages by @ryan-crabbe-berri in #26823
- test(vertex-batches): set is_redirect=False on mocked retrieve response by @yuneng-berri in #26844
- chore(vector-stores): redact credentials in list/info/update responses; gate update by per-store access by @stuxf in #26489
- chore(auth): substitute alias for master key on UserAPIKeyAuth by @stuxf in #26484
- fix(mcp): tighten public-route detection and OAuth2 fallback gating by @stuxf in #26463
- [Fix] Team member null budget fallback by @Michael-RZ-Berri in #26809
- fix(proxy): inherit caller identity in passthrough batch managed-object by @ryan-crabbe-berri in #26831
- fix(proxy/auth): tighten guardrail modification permission check by @ryan-crabbe-berri in #26821
- [Fix] CI/Tooling: Correct min-release-age value in .npmrc files by @yuneng-berri in #26850
- fix(bedrock): add 1-hour cache write pricing for Claude 4.5/4.6/4.7 (Global, US) by @mateo-berri in #26800
- feat(proxy): add team-level search provider credentials by @Sameerlite in #26691
- Litellm oss staging by @Sameerlite in #26759
- fix(proxy/batches): forward model to retrieve_batch for bedrock by @sruthi-sixt-26 in #26814
- fix(passthrough): track spend for interrupted Bedrock streams by @mateo-berri in #26719
- merge main by @Sameerlite in #26855
- Litellm oss staging by @Sameerlite in #26852
- chore(team): audit-log team-callback admin mutations by @stuxf in #26859
- chore(team): close authz bypass via the available-team check by @stuxf in #26854
- chore(auth): harden invite-link onboarding token flow by @stuxf in #26843
- chore(mcp): tighten OAuth root endpoint resolution by @stuxf in #26840
- fix: validate aws region name by @yassin-berriai in #26906
- fix: drop milvus dbName and partitionNames from MILVUS_OPTIONAL_PARAMS by @yassin-berriai in #26910
- [Feat / Fix] Lazy loaded imports, lazy loaded front page by @Michael-RZ-Berri in #26802
- chore(proxy): harden request control fields by @stuxf in #26862
- chore(proxy): block env callback refs in key metadata by @stuxf in #26851
- chore(mcp): encrypt user-scoped MCP credentials at rest by @stuxf in #26836
v1.83.14-stable.patch.1
Verify Docker Image Signature
All LiteLLM Docker images are signed with cosign. Every release is signed with the same key introduced in commit 0112e53.
Verify using the pinned commit hash (recommended):
A commit hash is cryptographically immutable, so this is the strongest way to ensure you are using the original signing key:
cosign verify \
--key https://raw.githubusercontent.com/BerriAI/litellm/0112e53046018d726492c814b3644b7d376029d0/cosign.pub \
ghcr.io/berriai/litellm:v1.83.14-stable.patch.1
Verify using the release tag (convenience):
Tags are protected in this repository and resolve to the same key. This option is easier to read but relies on tag protection rules:
cosign verify \
--key https://raw.githubusercontent.com/BerriAI/litellm/v1.83.14-stable.patch.1/cosign.pub \
ghcr.io/berriai/litellm:v1.83.14-stable.patch.1
Expected output:
The following checks were performed on each of these signatures:
- The cosign claims were validated
- The signatures were verified against the specified public key
Full Changelog: v1.83.14-stable...v1.83.14-stable.patch.1
1.84.0-dev.2
Verify Docker Image Signature
All LiteLLM Docker images are signed with cosign. Every release is signed with the same key introduced in commit 0112e53.
Verify using the pinned commit hash (recommended):
A commit hash is cryptographically immutable, so this is the strongest way to ensure you are using the original signing key:
cosign verify \
--key https://raw.githubusercontent.com/BerriAI/litellm/0112e53046018d726492c814b3644b7d376029d0/cosign.pub \
ghcr.io/berriai/litellm:1.84.0-dev.2
Verify using the release tag (convenience):
Tags are protected in this repository and resolve to the same key. This option is easier to read but relies on tag protection rules:
cosign verify \
--key https://raw.githubusercontent.com/BerriAI/litellm/1.84.0-dev.2/cosign.pub \
ghcr.io/berriai/litellm:1.84.0-dev.2
Expected output:
The following checks were performed on each of these signatures:
- The cosign claims were validated
- The signatures were verified against the specified public key
What's Changed
- merge main by @Sameerlite in #25521
- merge litellm_internal_staging by @Sameerlite in #25949
- merge main by @Sameerlite in #26304
- merge main by @Sameerlite in #26379
- merge main by @Sameerlite in #26437
- ci(release): accept PEP 440 tag forms in create-release workflow by @yuneng-berri in #26734
- [Feat] Add gpt-image-2 support (#26644) by @ishaan-berri in #26705
- feat(provider): add AIHubMix as an OpenAI-compatible provider by @xinrui-z in #24294
- merge internal staging by @Sameerlite in #26737
- merge main by @Sameerlite in #26742
- merge mian by @Sameerlite in #26745
- fix(test): scope ERROR log assertion to LiteLLM logger in test_model_alias_map by @mateo-berri in #26741
- merge main by @Sameerlite in #26757
- fix(bedrock, anthropic): translate OpenAI file content on tool-result path by @minznerjosh in #26710
- remove /ui/chat page by @ishaan-berri in #26739
- fix: add optional TCP SO_KEEPALIVE support to aiohttp's TCPConnector by @yassinkortam in #26730
- feat(proxy): LiteLLM headers on Google native generateContent routes by @Sameerlite in #25500
- feat(vector-stores): support Bedrock retrievalConfiguration passthrough by @Sameerlite in #26685
- feat(mcp): opt-in short-ID tool prefix to keep MCP tool names under the 60-char limit by @mateo-berri in #26733
- fix(proxy): self-heal Prisma read paths + harden reconnect state machine by @yuneng-berri in #26756
- [Fix] Redact spend logs error message by @Michael-RZ-Berri in #26662
- [Feat]Add support for azure entra discovery endpoint by @Sameerlite in #26584
- [Fix] Proxy: reconnect Prisma DB without blocking the event loop by @yuneng-berri in #26225
- feat(proxy): durable agent workflow run tracking via /v1/workflows/runs by @ishaan-berri in #26793
- chore(auth): tighten clientside api_base handling by @stuxf in #26518
- fix: drop sensitive locals from re-raised error messages by @ryan-crabbe-berri in #26823
- test(vertex-batches): set is_redirect=False on mocked retrieve response by @yuneng-berri in #26844
- chore(vector-stores): redact credentials in list/info/update responses; gate update by per-store access by @stuxf in #26489
- chore(auth): substitute alias for master key on UserAPIKeyAuth by @stuxf in #26484
- fix(mcp): tighten public-route detection and OAuth2 fallback gating by @stuxf in #26463
- [Fix] Team member null budget fallback by @Michael-RZ-Berri in #26809
- fix(proxy): inherit caller identity in passthrough batch managed-object by @ryan-crabbe-berri in #26831
- fix(proxy/auth): tighten guardrail modification permission check by @ryan-crabbe-berri in #26821
- [Fix] CI/Tooling: Correct min-release-age value in .npmrc files by @yuneng-berri in #26850
- fix(bedrock): add 1-hour cache write pricing for Claude 4.5/4.6/4.7 (Global, US) by @mateo-berri in #26800
- feat(proxy): add team-level search provider credentials by @Sameerlite in #26691
- Litellm oss staging by @Sameerlite in #26759
- fix(proxy/batches): forward model to retrieve_batch for bedrock by @sruthi-sixt-26 in #26814
- fix(passthrough): track spend for interrupted Bedrock streams by @mateo-berri in #26719
- merge main by @Sameerlite in #26855
- Litellm oss staging by @Sameerlite in #26852
- chore(team): audit-log team-callback admin mutations by @stuxf in #26859
- chore(team): close authz bypass via the available-team check by @stuxf in #26854
- chore(auth): harden invite-link onboarding token flow by @stuxf in #26843
- chore(mcp): tighten OAuth root endpoint resolution by @stuxf in #26840
- fix: validate aws region name by @yassin-berriai in #26906
- fix: drop milvus dbName and partitionNames from MILVUS_OPTIONAL_PARAMS by @yassin-berriai in #26910
- [Feat / Fix] Lazy loaded imports, lazy loaded front page by @Michael-RZ-Berri in #26802
- chore(proxy): harden request control fields by @stuxf in #26862
- chore(proxy): block env callback refs in key metadata by @stuxf in #26851
- chore(mcp): encrypt user-scoped MCP credentials at rest by @stuxf in #26836
- chore(mcp): SSRF guard on OAuth metadata discovery follow-up fetches by @stuxf in #26849
- [Fix] Replace subprocess startup-import diff with static source scan by @Michael-RZ-Berri in #26934
- chore(passthrough): default auth=True and drop enterprise gate on the safe option by @stuxf in #26827
- chore(proxy): contain UI_LOGO_PATH / LITELLM_FAVICON_URL on unauthenticated asset endpoints by @stuxf in #26815
- chore(cli): tighten CLI SSO session flow by @stuxf in #26835
- [Test] Proxy E2E: Opt In To Client Mock Response For Model Access Tests by @yuneng-berri in #26941
- [Fix] Responses API: Omit Empty Body On DELETE by @yuneng-berri in #26949
- Run pre_call_hook on Google generateContent endpoints by @Michael-RZ-Berri in #26914
- [Fix] Refresh Redis TTL on counter writes, skip stale in-memory in Redis by @Michael-RZ-Berri in #26829
- Add pagination controls to model health status by @shivamrawat1 in #26826
- feat(vertex_ai): propagate metadata labels to embedding, Imagen, rerank by @Sameerlite in #25499
- fix(anthropic): json response_format + user tools non-streaming by @Sameerlite in #26222
- [Infra] Bump Versions by @yuneng-berri in #26961
- [Infra] Promote Internal Staging to main by @yuneng-berri in #26962
New Contributors
- @xinrui-z made their first contribution in #24294
- @minznerjosh made their first contribution in #26710
- @yassinkortam made their first contribution in #26730
- @sruthi-sixt-26 made their first contribution in #26814
- @yassin-berriai made their first contribution in #26906
Full Changelog: 1.84.0-dev.1...1.84.0-dev.2
1.84.0-dev.1
Verify Docker Image Signature
All LiteLLM Docker images are signed with cosign. Every release is signed with the same key introduced in commit 0112e53.
Verify using the pinned commit hash (recommended):
A commit hash is cryptographically immutable, so this is the strongest way to ensure you are using the original signing key:
cosign verify \
--key https://raw.githubusercontent.com/BerriAI/litellm/0112e53046018d726492c814b3644b7d376029d0/cosign.pub \
ghcr.io/berriai/litellm:1.84.0-dev.1
Verify using the release tag (convenience):
Tags are protected in this repository and resolve to the same key. This option is easier to read but relies on tag protection rules:
cosign verify \
--key https://raw.githubusercontent.com/BerriAI/litellm/1.84.0-dev.1/cosign.pub \
ghcr.io/berriai/litellm:1.84.0-dev.1
Expected output:
The following checks were performed on each of these signatures:
- The cosign claims were validated
- The signatures were verified against the specified public key
What's Changed
- fix(redis): cache GCP IAM token to prevent async event loop blocking by @harish-berri in #26441
- litellm oss branch by @krrish-berri-2 in #26386
- fix noma v2 deepcopy crashing in build scan payload - new PR by @omriShukrun08 in #26605
- fix(ui): use stored-credentials endpoint for tools fetch on MCP edit page by @ryan-crabbe-berri in #26002
- feat(proxy): add --timeout_worker_healthcheck flag for uvicorn worker triage by @ryan-crabbe-berri in #26622
- fix(ci): support CircleCI rerun failed tests for local_testing jobs by @mateo-berri in #26461
- docs: update pull_request_template to add Linear ticket mentioning by @mateo-berri in #26655
- fix(pricing): GPT-5.5 Pro Pricing by @lmcdonald-godaddy in #26651
- feat(proxy): Add cleanup job for expired LiteLLM dashboard session keys by @milan-berri in #26460
- fix(ui): move 'Store Prompts in Spend Logs' toggle to Admin Settings by @ryan-crabbe-berri in #26631
- fix(caching): preserve prompt_tokens_details through embedding cache round-trip by @michelligabriele in #26653
- feat(logging): add retry settings for generic API logger by @milan-berri in #26645
- fix(logging): backfill streaming hidden response cost by @milan-berri in #26606
- fix(vertex-ai): reuse anthropic messages config instances by @Sameerlite in #26099
- fix(vertex): preserve items on array branches in anyOf with null + de-flake test by @yuneng-berri in #26675
- fix(tests): replace deprecated Bedrock Claude 3.7 Sonnet model ID by @ryan-crabbe-berri in #26721
- [Fix] Cache LiteLLM_Config param reads in DualCache and batch by @Michael-RZ-Berri in #26469
- [Feat] Lazy-load optional feature routers on first request by @Michael-RZ-Berri in #26534
- [Fix] Unify cost calc in success_handler dict and typed branches by @Michael-RZ-Berri in #26629
- Revert "[Feat] Lazy-load optional feature routers on first request" by @krrish-berri-2 in #26727
- [Infra] Version Bump by @yuneng-berri in #26728
- [Infra] Promote Internal Staging to main by @yuneng-berri in #26731
New Contributors
- @omriShukrun08 made their first contribution in #26605
- @lmcdonald-godaddy made their first contribution in #26651
Full Changelog: v1.83.14.rc.1...1.84.0-dev.1
v1.83.14.rc.1
Verify Docker Image Signature
All LiteLLM Docker images are signed with cosign. Every release is signed with the same key introduced in commit 0112e53.
Verify using the pinned commit hash (recommended):
A commit hash is cryptographically immutable, so this is the strongest way to ensure you are using the original signing key:
cosign verify \
--key https://raw.githubusercontent.com/BerriAI/litellm/0112e53046018d726492c814b3644b7d376029d0/cosign.pub \
ghcr.io/berriai/litellm:v1.83.14.rc.1
Verify using the release tag (convenience):
Tags are protected in this repository and resolve to the same key. This option is easier to read but relies on tag protection rules:
cosign verify \
--key https://raw.githubusercontent.com/BerriAI/litellm/v1.83.14.rc.1/cosign.pub \
ghcr.io/berriai/litellm:v1.83.14.rc.1
Expected output:
The following checks were performed on each of these signatures:
- The cosign claims were validated
- The signatures were verified against the specified public key
What's Changed
- fix: preserve tool_use input args in Anthropic adapter streaming by @Chesars in #24355
- fix: preserve role='assistant' in Azure streaming with include_usage by @Chesars in #24354
- fix: map Zhipu GLM non-standard finish_reason values by @Chesars in #24373
- fix(responses-api): apply GPT-5 temperature validation by @Chesars in #24371
- fix(bedrock): sort assistant content blocks so text precedes toolUse by @Chesars in #24368
- fix(gemini): filter params from embedding requests by @Chesars in #24370
- fix(gemini): read web search cost from model_info instead of hardcode by @Chesars in #24372
- fix(gemini): include DOCUMENT modality tokens in cost calculation by @Chesars in #24410
- docs: add missing observability integrations to View All page by @Chesars in #24420
- fix(vertex_ai): forward dimensions parameter in multimodalembedding requests by @Chesars in #24415
- refactor(responses): extract shared format mapping between Responses API and Chat Completions bridges by @Chesars in #24417
- fix(model-prices): migrate 38 models from legacy max_tokens to max_input_tokens/max_output_tokens by @Chesars in #24422
- feat(bedrock): add GLM-5 and Minimax M2.5 with regional aliases by @Chesars in #24423
- fix: update bedrock claude sonnet/opus 4.6 above 200k token pricing and sonnet 4.6 max_input_tokens to 1M by @dongyu-turo in #24164
- merge litellm_internal_staging by @Sameerlite in #25942
- merge litellm_internal_staging by @Sameerlite in #25945
- Sync litellm_staging_03_22_2026 with litellm_internal_staging by @Chesars in #26283
- merge main by @Sameerlite in #26301
- merge main by @Sameerlite in #26303
- fix(router): restore BYOK key injection for vector store endpoints with team-scoped deployments by @shivamrawat1 in #25746
- [Infra] Remove CCI/GHA test duplication and semantically shard proxy DB tests by @yuneng-berri in #26356
- merge main by @Sameerlite in #26381
- Split MCP routes into inference vs management (unblock Admin UI on DISABLE_LLM_API_ENDPOINTS nodes) by @ryan-crabbe-berri in #26367
- feat(responses): add use_chat_completions_api flag for openai/ models with custom api_base by @Sameerlite in #25346
- fix(team_endpoints): auto-add SSO team members to org on move (proxy admin only) by @ishaan-berri in #26377
- sync litellm_staging_03_22_2026 with litellm_internal_staging by @Chesars in #26440
- fix(proxy): respect object-level permissions for managed vector store endpoints by @shivamrawat1 in #26351
- feat(pricing): gemini-embedding-2 GA cost map, blog, and test by @Sameerlite in #26391
- fix(responses): normalize bridged object field by @Sameerlite in #26327
- feat(models): add versioned GPT-5.4 mini/nano snapshots by @Sameerlite in #26115
- fix(proxy): preserve anthropic_messages call type for /v1/messages logging by @Sameerlite in #26248
- feat(responses): strip custom_tool_call namespace for all providers by @Sameerlite in #26221
- fix(anthropic): strip Gemini thought suffix from streaming tool_use id by @Sameerlite in #25935
- feat(docs): align fenced code block padding on blog and doc pages by @Sameerlite in #25932
- docs(gemini): Gemini 3 thinking_level defaults and release note by @Sameerlite in #25842
- docs(proxy): clarify x-litellm-model-group vs provider model id by @Sameerlite in #25497
- [Fix] Tests - Proxy: Isolate master_key/prisma_client module globals between tests by @yuneng-berri in #26362
- feat(openai): add route_all_chat_openai_to_responses global flag by @Sameerlite in #25359
- Litellm staging 03 22 2026 by @Chesars in #24374
- chore(packaging): declare MIT license in litellm-proxy-extras metadata by @stuxf in #26369
- chore(deps): bump vulnerable dependencies by @stuxf in #26365
- fix(auth): centralize common_checks to close authorization bypass by @stuxf in #26279
- fix(mcp): harden OAuth authorize/token endpoints (BYOK + discoverable) by @stuxf in #26274
- [Feat] Day-0 support for GPT-5.5 and GPT-5.5 Pro by @mateo-berri in #26449
- [Infra] Remove docs/my-website, point contributors to litellm-docs repo by @yuneng-berri in #26454
- fix(vertex passthrough): log :embedContent and :batchEmbedContents responses by @ishaan-berri in #26146
- fix(jwt-auth): apply team TPM/RPM + attribution for admins using x-litellm-team-id by @ryan-crabbe-berri in #26438
- [Infra] Declare proprietary license in litellm-enterprise metadata by @yuneng-berri in #26457
- feat(guardrails): LLM-as-a-Judge guardrail by @ishaan-berri in #26360
- [Fix] Guardrail param handling in list and submission endpoints by @yuneng-berri in #26390
- [Feature] UI - Users: Add Send Invitation Email Toggle by @yuneng-berri in #25808
- [Refactor] Proxy: move projects management to enterprise package by @yuneng-berri in #25677
- fix(proxy): single-team DB fallback when JWT has no team_id by @milan-berri in #26418
- [Fix] Harden team metadata handling in /team/new and /team/update by @yuneng-berri in #26464
- [Feat] Add azure/gpt-5.5 + azure/gpt-5.5-pro entries (+ dated variants) by @mateo-berri in #26361
- feat(proxy): add /v1/memory CRUD endpoints by @krrish-berri-2 in #26218
- [Fix] Harden pass-through target URL construction by @yuneng-berri in #26467
- [Fix] Tighten caller-permission checks on key route fields by @yuneng-berri in #26492
- [Fix] Extend caller-permission checks to service-account + tighten raw-body acceptance by @yuneng-berri in #26493
- feat: UI setting to disable /key/generate for org admins by @ryan-crabbe-berri in #26442
- fix(ui): stop injecting $0 cost on model edit by @ryan-crabbe-berri in #26001
- fix: preserve service_account_id in metadata on /key/update by @ryan-crabbe-berri in #26004
- [Feature] UI - Spend Logs: sortable Model and TTFT columns by @yuneng-berri in #26488
- [Fix] Restrict /global/spend/* routes to admin roles by @yuneng-berri in #26490
- [Infra] Merge dev branch by @yuneng-berri in #26496
- Sync litellm_staging_03_23_2026 with litellm_internal_staging by @Chesars in #26510
- Litellm staging 03 23 2026 by @Chesars in #24428
- ci: add supply-chain guard to block fork PRs that modify dependencies by @krrish-berri-2 in #26511
- [Fix] Align MCP OAuth proxy endpoints with per-server acces...
v1.83.14-stable
Verify Docker Image Signature
All LiteLLM Docker images are signed with cosign. Every release is signed with the same key introduced in commit 0112e53.
Verify using the pinned commit hash (recommended):
A commit hash is cryptographically immutable, so this is the strongest way to ensure you are using the original signing key:
cosign verify \
--key https://raw.githubusercontent.com/BerriAI/litellm/0112e53046018d726492c814b3644b7d376029d0/cosign.pub \
ghcr.io/berriai/litellm:v1.83.14-stable
Verify using the release tag (convenience):
Tags are protected in this repository and resolve to the same key. This option is easier to read but relies on tag protection rules:
cosign verify \
--key https://raw.githubusercontent.com/BerriAI/litellm/v1.83.14-stable/cosign.pub \
ghcr.io/berriai/litellm:v1.83.14-stable
Expected output:
The following checks were performed on each of these signatures:
- The cosign claims were validated
- The signatures were verified against the specified public key
What's Changed
- fix: preserve tool_use input args in Anthropic adapter streaming by @Chesars in #24355
- fix: preserve role='assistant' in Azure streaming with include_usage by @Chesars in #24354
- fix: map Zhipu GLM non-standard finish_reason values by @Chesars in #24373
- fix(responses-api): apply GPT-5 temperature validation by @Chesars in #24371
- fix(bedrock): sort assistant content blocks so text precedes toolUse by @Chesars in #24368
- fix(gemini): filter params from embedding requests by @Chesars in #24370
- fix(gemini): read web search cost from model_info instead of hardcode by @Chesars in #24372
- fix(gemini): include DOCUMENT modality tokens in cost calculation by @Chesars in #24410
- docs: add missing observability integrations to View All page by @Chesars in #24420
- fix(vertex_ai): forward dimensions parameter in multimodalembedding requests by @Chesars in #24415
- refactor(responses): extract shared format mapping between Responses API and Chat Completions bridges by @Chesars in #24417
- fix(model-prices): migrate 38 models from legacy max_tokens to max_input_tokens/max_output_tokens by @Chesars in #24422
- feat(bedrock): add GLM-5 and Minimax M2.5 with regional aliases by @Chesars in #24423
- fix: update bedrock claude sonnet/opus 4.6 above 200k token pricing and sonnet 4.6 max_input_tokens to 1M by @dongyu-turo in #24164
- merge litellm_internal_staging by @Sameerlite in #25942
- merge litellm_internal_staging by @Sameerlite in #25945
- Sync litellm_staging_03_22_2026 with litellm_internal_staging by @Chesars in #26283
- merge main by @Sameerlite in #26301
- merge main by @Sameerlite in #26303
- fix(router): restore BYOK key injection for vector store endpoints with team-scoped deployments by @shivamrawat1 in #25746
- [Infra] Remove CCI/GHA test duplication and semantically shard proxy DB tests by @yuneng-berri in #26356
- merge main by @Sameerlite in #26381
- Split MCP routes into inference vs management (unblock Admin UI on DISABLE_LLM_API_ENDPOINTS nodes) by @ryan-crabbe-berri in #26367
- feat(responses): add use_chat_completions_api flag for openai/ models with custom api_base by @Sameerlite in #25346
- fix(team_endpoints): auto-add SSO team members to org on move (proxy admin only) by @ishaan-berri in #26377
- sync litellm_staging_03_22_2026 with litellm_internal_staging by @Chesars in #26440
- fix(proxy): respect object-level permissions for managed vector store endpoints by @shivamrawat1 in #26351
- feat(pricing): gemini-embedding-2 GA cost map, blog, and test by @Sameerlite in #26391
- fix(responses): normalize bridged object field by @Sameerlite in #26327
- feat(models): add versioned GPT-5.4 mini/nano snapshots by @Sameerlite in #26115
- fix(proxy): preserve anthropic_messages call type for /v1/messages logging by @Sameerlite in #26248
- feat(responses): strip custom_tool_call namespace for all providers by @Sameerlite in #26221
- fix(anthropic): strip Gemini thought suffix from streaming tool_use id by @Sameerlite in #25935
- feat(docs): align fenced code block padding on blog and doc pages by @Sameerlite in #25932
- docs(gemini): Gemini 3 thinking_level defaults and release note by @Sameerlite in #25842
- docs(proxy): clarify x-litellm-model-group vs provider model id by @Sameerlite in #25497
- [Fix] Tests - Proxy: Isolate master_key/prisma_client module globals between tests by @yuneng-berri in #26362
- feat(openai): add route_all_chat_openai_to_responses global flag by @Sameerlite in #25359
- Litellm staging 03 22 2026 by @Chesars in #24374
- chore(packaging): declare MIT license in litellm-proxy-extras metadata by @stuxf in #26369
- chore(deps): bump vulnerable dependencies by @stuxf in #26365
- fix(auth): centralize common_checks to close authorization bypass by @stuxf in #26279
- fix(mcp): harden OAuth authorize/token endpoints (BYOK + discoverable) by @stuxf in #26274
- [Feat] Day-0 support for GPT-5.5 and GPT-5.5 Pro by @mateo-berri in #26449
- [Infra] Remove docs/my-website, point contributors to litellm-docs repo by @yuneng-berri in #26454
- fix(vertex passthrough): log :embedContent and :batchEmbedContents responses by @ishaan-berri in #26146
- fix(jwt-auth): apply team TPM/RPM + attribution for admins using x-litellm-team-id by @ryan-crabbe-berri in #26438
- [Infra] Declare proprietary license in litellm-enterprise metadata by @yuneng-berri in #26457
- feat(guardrails): LLM-as-a-Judge guardrail by @ishaan-berri in #26360
- [Fix] Guardrail param handling in list and submission endpoints by @yuneng-berri in #26390
- [Feature] UI - Users: Add Send Invitation Email Toggle by @yuneng-berri in #25808
- [Refactor] Proxy: move projects management to enterprise package by @yuneng-berri in #25677
- fix(proxy): single-team DB fallback when JWT has no team_id by @milan-berri in #26418
- [Fix] Harden team metadata handling in /team/new and /team/update by @yuneng-berri in #26464
- [Feat] Add azure/gpt-5.5 + azure/gpt-5.5-pro entries (+ dated variants) by @mateo-berri in #26361
- feat(proxy): add /v1/memory CRUD endpoints by @krrish-berri-2 in #26218
- [Fix] Harden pass-through target URL construction by @yuneng-berri in #26467
- [Fix] Tighten caller-permission checks on key route fields by @yuneng-berri in #26492
- [Fix] Extend caller-permission checks to service-account + tighten raw-body acceptance by @yuneng-berri in #26493
- feat: UI setting to disable /key/generate for org admins by @ryan-crabbe-berri in #26442
- fix(ui): stop injecting $0 cost on model edit by @ryan-crabbe-berri in #26001
- fix: preserve service_account_id in metadata on /key/update by @ryan-crabbe-berri in #26004
- [Feature] UI - Spend Logs: sortable Model and TTFT columns by @yuneng-berri in #26488
- [Fix] Restrict /global/spend/* routes to admin roles by @yuneng-berri in #26490
- [Infra] Merge dev branch by @yuneng-berri in #26496
- Sync litellm_staging_03_23_2026 with litellm_internal_staging by @Chesars in #26510
- Litellm staging 03 23 2026 by @Chesars in #24428
- ci: add supply-chain guard to block fork PRs that modify dependencies by @krrish-berri-2 in #26511
- [Fix] Align MCP OAuth proxy endpoints with per-server...
v1.83.13-nightly
Verify Docker Image Signature
All LiteLLM Docker images are signed with cosign. Every release is signed with the same key introduced in commit 0112e53.
Verify using the pinned commit hash (recommended):
A commit hash is cryptographically immutable, so this is the strongest way to ensure you are using the original signing key:
cosign verify \
--key https://raw.githubusercontent.com/BerriAI/litellm/0112e53046018d726492c814b3644b7d376029d0/cosign.pub \
ghcr.io/berriai/litellm:v1.83.13-nightly
Verify using the release tag (convenience):
Tags are protected in this repository and resolve to the same key. This option is easier to read but relies on tag protection rules:
cosign verify \
--key https://raw.githubusercontent.com/BerriAI/litellm/v1.83.13-nightly/cosign.pub \
ghcr.io/berriai/litellm:v1.83.13-nightly
Expected output:
The following checks were performed on each of these signatures:
- The cosign claims were validated
- The signatures were verified against the specified public key
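The verification steps above can be wrapped in a small guard script for CI or deploy pipelines, a minimal sketch assuming cosign is installed and using the pinned-commit key URL from the notes above (the image tag is this release's; adjust as needed):

```shell
#!/bin/sh
# Gate a deploy on cosign signature verification (hypothetical CI guard).
# KEY_URL pins the public key to an immutable commit, the stronger of the
# two options described above.
IMAGE="ghcr.io/berriai/litellm:v1.83.13-nightly"
KEY_URL="https://raw.githubusercontent.com/BerriAI/litellm/0112e53046018d726492c814b3644b7d376029d0/cosign.pub"

if ! command -v cosign >/dev/null 2>&1; then
  # Don't hard-fail on machines without cosign; just report.
  echo "cosign not installed; skipping verification"
elif cosign verify --key "$KEY_URL" "$IMAGE" >/dev/null 2>&1; then
  echo "signature OK: $IMAGE"
else
  echo "signature verification FAILED: $IMAGE" >&2
  exit 1
fi
```

A nonzero exit stops the pipeline before an unsigned or tampered image is pulled into production.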
What's Changed
- merge main by @Sameerlite in #26258
- [Fix] Align image URL fetch with validated HTTP client in Bedrock and token counter paths by @yuneng-berri in #26272
- [Fix] Extend request body parameter restrictions to cloud provider auth fields by @yuneng-berri in #26264
- [Fix] Enforce format constraints on provider URL parameters by @yuneng-berri in #26287
- fix(mcp_semantic_tool_filter): match tools with client-side namespace prefix (#26078) by @sakenuGOD in #26117
- fix(adapter): normalize reasoning effort with graceful degradation by @Vigilans in #26111
- fix(anthropic): skip non-OpenAI file content blocks in file-id discovery helpers by @anmolg1997 in #26228
- feat(messages): map reasoning_auto_summary to thinking.display for native /v1/messages by @Vigilans in #25883
- fix(ovhcloud): Fix tool calling not working by @eliasto in #25948
- fix(anthropic): handle tool_choice type 'none' in messages API by @BillionClaw in #24457
- fix(ui): Fetch button ignores active filters on Request Logs page by @Bytechoreographer in #25788
- fix(ui): stale filters applied after sort/page/time change on Request… by @Bytechoreographer in #25789
- refactor: replace substring check with startswith in is_model_gpt_5_model by @BraulioV in #25793
- Feat(dashscope): add image generation support for qwen-image-2.0 and qwen-image-2.0-pro by @Alpha-Zark in #25672
- fix(image_edit): forward litellm_params to validate_environment for Vertex AI credentials by @Sameerlite in #26160
- feat: Expand VideoMetadata support to all Gemini Models. by @vinhphamhuu-ct in #25767
- Litellm oss staging 04 22 2026 by @krrish-berri-2 in #26300
- [Infra] CCI: cache, cleanup, anchors, install-path parity, Python 3.12, Ruby/Node pins by @yuneng-berri in #26286
- [Fix] Image edit endpoints: enforce multipart-only file inputs by @yuneng-berri in #26293
- [Infra] Merge dev branch by @yuneng-berri in #26336
- feat: add gpt-5.5 to model cost map by @mateo-berri in #26345
- [Infra] Add standalone create-release-branch workflow by @yuneng-berri in #26342
- feat: add gpt-5.5 to model cost map by @mateo-berri in #26348
- Fix bugs that bypasses per-team member budget limit by @Michael-RZ-Berri in #26204
- [Fix] Tests - drain logging worker in test_router_caching_ttl to fix flakiness by @yuneng-berri in #26355
- feat(vertex_ai): multi-region Vertex hosts (aiplatform.*.rep.googleapis.com) by @milan-berri in #26281
- [Fix] Infra: grant contents:write to create-release-branch caller job by @yuneng-berri in #26359
- [Fix] Deflake spend tracking tests by @yuneng-berri in #26349
- fix(proxy): share temporary MCP OAuth sessions across instances via Redis by @milan-berri in #26318
- [Fix] Reset budget windows failing due to Prisma Json? null filter by @yuneng-berri in #26346
- Surface per-member budget cycle in Teams > Members tab by @ryan-crabbe-berri in #26207
- [Infra] Bump version 1.83.12 → 1.83.13 by @yuneng-berri in #26370
- [Infra] Promote internal staging to main by @yuneng-berri in #26375
New Contributors
- @sakenuGOD made their first contribution in #26117
- @Vigilans made their first contribution in #26111
- @anmolg1997 made their first contribution in #26228
- @Bytechoreographer made their first contribution in #25788
- @BraulioV made their first contribution in #25793
- @Alpha-Zark made their first contribution in #25672
- @vinhphamhuu-ct made their first contribution in #25767
Full Changelog: v1.83.12-nightly...v1.83.13-nightly