
Releases: BerriAI/litellm

v1.83.14-stable.patch.3

07 May 23:42
cd34090

Verify Docker Image Signature

All LiteLLM Docker images are signed with cosign. Every release is signed with the same key introduced in commit 0112e53.

Verify using the pinned commit hash (recommended):

A commit hash is cryptographically immutable, so this is the strongest way to ensure you are using the original signing key:

cosign verify \
  --key https://raw.githubusercontent.com/BerriAI/litellm/0112e53046018d726492c814b3644b7d376029d0/cosign.pub \
  ghcr.io/berriai/litellm:v1.83.14-stable.patch.3

Verify using the release tag (convenience):

Tags are protected in this repository and resolve to the same key. This option is easier to read but relies on tag protection rules:

cosign verify \
  --key https://raw.githubusercontent.com/BerriAI/litellm/v1.83.14-stable.patch.3/cosign.pub \
  ghcr.io/berriai/litellm:v1.83.14-stable.patch.3

Expected output:

The following checks were performed on each of these signatures:
  - The cosign claims were validated
  - The signatures were verified against the specified public key
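
Since every release is signed with the same key, the commit-pinned key only needs to be fetched once and can then be reused for any number of tags. A minimal sketch, assuming cosign and curl are installed; the verify_tag helper is ours, not part of the release tooling, and the network steps are skipped when cosign is absent:

```shell
#!/bin/sh
# Fetch the commit-pinned public key once, then verify several
# release tags against it. KEY_REF is the commit that introduced
# the signing key (0112e53).
set -eu

KEY_REF="0112e53046018d726492c814b3644b7d376029d0"
KEY_URL="https://raw.githubusercontent.com/BerriAI/litellm/${KEY_REF}/cosign.pub"

# Hypothetical helper: verify one ghcr.io/berriai/litellm tag
# against the locally saved pinned key.
verify_tag() {
  cosign verify --key cosign.pub "ghcr.io/berriai/litellm:$1"
}

# Only touch the network when cosign is actually available.
if command -v cosign >/dev/null 2>&1; then
  curl -fsSL "$KEY_URL" -o cosign.pub
  for tag in v1.83.14-stable v1.83.14-stable.patch.3; do
    verify_tag "$tag"
  done
fi
```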

Full Changelog: v1.83.14-stable.patch.2...v1.83.14-stable.patch.3

v1.83.14-stable.patch.2

06 May 02:37
b36fb1d

Verify Docker Image Signature

All LiteLLM Docker images are signed with cosign. Every release is signed with the same key introduced in commit 0112e53.

Verify using the pinned commit hash (recommended):

A commit hash is cryptographically immutable, so this is the strongest way to ensure you are using the original signing key:

cosign verify \
  --key https://raw.githubusercontent.com/BerriAI/litellm/0112e53046018d726492c814b3644b7d376029d0/cosign.pub \
  ghcr.io/berriai/litellm:v1.83.14-stable.patch.2

Verify using the release tag (convenience):

Tags are protected in this repository and resolve to the same key. This option is easier to read but relies on tag protection rules:

cosign verify \
  --key https://raw.githubusercontent.com/BerriAI/litellm/v1.83.14-stable.patch.2/cosign.pub \
  ghcr.io/berriai/litellm:v1.83.14-stable.patch.2

Expected output:

The following checks were performed on each of these signatures:
  - The cosign claims were validated
  - The signatures were verified against the specified public key

Full Changelog: v1.83.14-stable.patch.1...v1.83.14-stable.patch.2

v1.83.10-stable.patch.1

06 May 02:53
04291b4

Verify Docker Image Signature

All LiteLLM Docker images are signed with cosign. Every release is signed with the same key introduced in commit 0112e53.

Verify using the pinned commit hash (recommended):

A commit hash is cryptographically immutable, so this is the strongest way to ensure you are using the original signing key:

cosign verify \
  --key https://raw.githubusercontent.com/BerriAI/litellm/0112e53046018d726492c814b3644b7d376029d0/cosign.pub \
  ghcr.io/berriai/litellm:v1.83.10-stable.patch.1

Verify using the release tag (convenience):

Tags are protected in this repository and resolve to the same key. This option is easier to read but relies on tag protection rules:

cosign verify \
  --key https://raw.githubusercontent.com/BerriAI/litellm/v1.83.10-stable.patch.1/cosign.pub \
  ghcr.io/berriai/litellm:v1.83.10-stable.patch.1

Expected output:

The following checks were performed on each of these signatures:
  - The cosign claims were validated
  - The signatures were verified against the specified public key

Full Changelog: v1.83.10-stable...v1.83.10-stable.patch.1

v1.84.0-rc.1

05 May 23:46
6ff668c

Pre-release

Verify Docker Image Signature

All LiteLLM Docker images are signed with cosign. Every release is signed with the same key introduced in commit 0112e53.

Verify using the pinned commit hash (recommended):

A commit hash is cryptographically immutable, so this is the strongest way to ensure you are using the original signing key:

cosign verify \
  --key https://raw.githubusercontent.com/BerriAI/litellm/0112e53046018d726492c814b3644b7d376029d0/cosign.pub \
  ghcr.io/berriai/litellm:v1.84.0-rc.1

Verify using the release tag (convenience):

Tags are protected in this repository and resolve to the same key. This option is easier to read but relies on tag protection rules:

cosign verify \
  --key https://raw.githubusercontent.com/BerriAI/litellm/v1.84.0-rc.1/cosign.pub \
  ghcr.io/berriai/litellm:v1.84.0-rc.1

Expected output:

The following checks were performed on each of these signatures:
  - The cosign claims were validated
  - The signatures were verified against the specified public key
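
Because cosign exits non-zero when verification fails, the check can gate a deploy step directly. A sketch, not part of the release tooling, assuming cosign is installed and the pinned key has been saved locally as cosign.pub:

```shell
#!/bin/sh
# Gate a hypothetical deploy on the signature check: proceed only
# if cosign verifies the image against the pinned key. When cosign
# is not installed, nothing is verified and the gate reports that.
IMAGE="ghcr.io/berriai/litellm:v1.84.0-rc.1"

if command -v cosign >/dev/null 2>&1; then
  if cosign verify --key cosign.pub "$IMAGE" >/dev/null 2>&1; then
    RESULT="verified"   # safe to deploy
  else
    RESULT="rejected"   # do not deploy this image
  fi
else
  RESULT="skipped"      # cosign not installed; nothing verified
fi
echo "$RESULT"
```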

What's Changed


v1.83.14-stable.patch.1

04 May 17:51
93d8375

Verify Docker Image Signature

All LiteLLM Docker images are signed with cosign. Every release is signed with the same key introduced in commit 0112e53.

Verify using the pinned commit hash (recommended):

A commit hash is cryptographically immutable, so this is the strongest way to ensure you are using the original signing key:

cosign verify \
  --key https://raw.githubusercontent.com/BerriAI/litellm/0112e53046018d726492c814b3644b7d376029d0/cosign.pub \
  ghcr.io/berriai/litellm:v1.83.14-stable.patch.1

Verify using the release tag (convenience):

Tags are protected in this repository and resolve to the same key. This option is easier to read but relies on tag protection rules:

cosign verify \
  --key https://raw.githubusercontent.com/BerriAI/litellm/v1.83.14-stable.patch.1/cosign.pub \
  ghcr.io/berriai/litellm:v1.83.14-stable.patch.1

Expected output:

The following checks were performed on each of these signatures:
  - The cosign claims were validated
  - The signatures were verified against the specified public key

Full Changelog: v1.83.14-stable...v1.83.14-stable.patch.1

1.84.0-dev.2

01 May 06:43
934ecdc

Verify Docker Image Signature

All LiteLLM Docker images are signed with cosign. Every release is signed with the same key introduced in commit 0112e53.

Verify using the pinned commit hash (recommended):

A commit hash is cryptographically immutable, so this is the strongest way to ensure you are using the original signing key:

cosign verify \
  --key https://raw.githubusercontent.com/BerriAI/litellm/0112e53046018d726492c814b3644b7d376029d0/cosign.pub \
  ghcr.io/berriai/litellm:1.84.0-dev.2

Verify using the release tag (convenience):

Tags are protected in this repository and resolve to the same key. This option is easier to read but relies on tag protection rules:

cosign verify \
  --key https://raw.githubusercontent.com/BerriAI/litellm/1.84.0-dev.2/cosign.pub \
  ghcr.io/berriai/litellm:1.84.0-dev.2

Expected output:

The following checks were performed on each of these signatures:
  - The cosign claims were validated
  - The signatures were verified against the specified public key

What's Changed

New Contributors

Full Changelog: 1.84.0-dev.1...1.84.0-dev.2

1.84.0-dev.1

29 Apr 02:47
3e1479c

Verify Docker Image Signature

All LiteLLM Docker images are signed with cosign. Every release is signed with the same key introduced in commit 0112e53.

Verify using the pinned commit hash (recommended):

A commit hash is cryptographically immutable, so this is the strongest way to ensure you are using the original signing key:

cosign verify \
  --key https://raw.githubusercontent.com/BerriAI/litellm/0112e53046018d726492c814b3644b7d376029d0/cosign.pub \
  ghcr.io/berriai/litellm:1.84.0-dev.1

Verify using the release tag (convenience):

Tags are protected in this repository and resolve to the same key. This option is easier to read but relies on tag protection rules:

cosign verify \
  --key https://raw.githubusercontent.com/BerriAI/litellm/1.84.0-dev.1/cosign.pub \
  ghcr.io/berriai/litellm:1.84.0-dev.1

Expected output:

The following checks were performed on each of these signatures:
  - The cosign claims were validated
  - The signatures were verified against the specified public key

What's Changed

New Contributors

Full Changelog: v1.83.14.rc.1...1.84.0-dev.1

v1.83.14.rc.1

27 Apr 17:01
3d2b8fe

Verify Docker Image Signature

All LiteLLM Docker images are signed with cosign. Every release is signed with the same key introduced in commit 0112e53.

Verify using the pinned commit hash (recommended):

A commit hash is cryptographically immutable, so this is the strongest way to ensure you are using the original signing key:

cosign verify \
  --key https://raw.githubusercontent.com/BerriAI/litellm/0112e53046018d726492c814b3644b7d376029d0/cosign.pub \
  ghcr.io/berriai/litellm:v1.83.14.rc.1

Verify using the release tag (convenience):

Tags are protected in this repository and resolve to the same key. This option is easier to read but relies on tag protection rules:

cosign verify \
  --key https://raw.githubusercontent.com/BerriAI/litellm/v1.83.14.rc.1/cosign.pub \
  ghcr.io/berriai/litellm:v1.83.14.rc.1

Expected output:

The following checks were performed on each of these signatures:
  - The cosign claims were validated
  - The signatures were verified against the specified public key

What's Changed

  • fix: preserve tool_use input args in Anthropic adapter streaming by @Chesars in #24355
  • fix: preserve role='assistant' in Azure streaming with include_usage by @Chesars in #24354
  • fix: map Zhipu GLM non-standard finish_reason values by @Chesars in #24373
  • fix(responses-api): apply GPT-5 temperature validation by @Chesars in #24371
  • fix(bedrock): sort assistant content blocks so text precedes toolUse by @Chesars in #24368
  • fix(gemini): filter params from embedding requests by @Chesars in #24370
  • fix(gemini): read web search cost from model_info instead of hardcode by @Chesars in #24372
  • fix(gemini): include DOCUMENT modality tokens in cost calculation by @Chesars in #24410
  • docs: add missing observability integrations to View All page by @Chesars in #24420
  • fix(vertex_ai): forward dimensions parameter in multimodalembedding requests by @Chesars in #24415
  • refactor(responses): extract shared format mapping between Responses API and Chat Completions bridges by @Chesars in #24417
  • fix(model-prices): migrate 38 models from legacy max_tokens to max_input_tokens/max_output_tokens by @Chesars in #24422
  • feat(bedrock): add GLM-5 and Minimax M2.5 with regional aliases by @Chesars in #24423
  • fix: update bedrock claude sonnet/opus 4.6 above 200k token pricing and sonnet 4.6 max_input_tokens to 1M by @dongyu-turo in #24164
  • merge litellm_internal_staging by @Sameerlite in #25942
  • merge litellm_internal_staging by @Sameerlite in #25945
  • Sync litellm_staging_03_22_2026 with litellm_internal_staging by @Chesars in #26283
  • merge main by @Sameerlite in #26301
  • merge main by @Sameerlite in #26303
  • fix(router): restore BYOK key injection for vector store endpoints with team-scoped deployments by @shivamrawat1 in #25746
  • [Infra] Remove CCI/GHA test duplication and semantically shard proxy DB tests by @yuneng-berri in #26356
  • merge main by @Sameerlite in #26381
  • Split MCP routes into inference vs management (unblock Admin UI on DISABLE_LLM_API_ENDPOINTS nodes) by @ryan-crabbe-berri in #26367
  • feat(responses): add use_chat_completions_api flag for openai/ models with custom api_base by @Sameerlite in #25346
  • fix(team_endpoints): auto-add SSO team members to org on move (proxy admin only) by @ishaan-berri in #26377
  • sync litellm_staging_03_22_2026 with litellm_internal_staging by @Chesars in #26440
  • fix(proxy): respect object-level permissions for managed vector store endpoints by @shivamrawat1 in #26351
  • feat(pricing): gemini-embedding-2 GA cost map, blog, and test by @Sameerlite in #26391
  • fix(responses): normalize bridged object field by @Sameerlite in #26327
  • feat(models): add versioned GPT-5.4 mini/nano snapshots by @Sameerlite in #26115
  • fix(proxy): preserve anthropic_messages call type for /v1/messages logging by @Sameerlite in #26248
  • feat(responses): strip custom_tool_call namespace for all providers by @Sameerlite in #26221
  • fix(anthropic): strip Gemini thought suffix from streaming tool_use id by @Sameerlite in #25935
  • feat(docs): align fenced code block padding on blog and doc pages by @Sameerlite in #25932
  • docs(gemini): Gemini 3 thinking_level defaults and release note by @Sameerlite in #25842
  • docs(proxy): clarify x-litellm-model-group vs provider model id by @Sameerlite in #25497
  • [Fix] Tests - Proxy: Isolate master_key/prisma_client module globals between tests by @yuneng-berri in #26362
  • feat(openai): add route_all_chat_openai_to_responses global flag by @Sameerlite in #25359
  • Litellm staging 03 22 2026 by @Chesars in #24374
  • chore(packaging): declare MIT license in litellm-proxy-extras metadata by @stuxf in #26369
  • chore(deps): bump vulnerable dependencies by @stuxf in #26365
  • fix(auth): centralize common_checks to close authorization bypass by @stuxf in #26279
  • fix(mcp): harden OAuth authorize/token endpoints (BYOK + discoverable) by @stuxf in #26274
  • [Feat] Day-0 support for GPT-5.5 and GPT-5.5 Pro by @mateo-berri in #26449
  • [Infra] Remove docs/my-website, point contributors to litellm-docs repo by @yuneng-berri in #26454
  • fix(vertex passthrough): log :embedContent and :batchEmbedContents responses by @ishaan-berri in #26146
  • fix(jwt-auth): apply team TPM/RPM + attribution for admins using x-litellm-team-id by @ryan-crabbe-berri in #26438
  • [Infra] Declare proprietary license in litellm-enterprise metadata by @yuneng-berri in #26457
  • feat(guardrails): LLM-as-a-Judge guardrail by @ishaan-berri in #26360
  • [Fix] Guardrail param handling in list and submission endpoints by @yuneng-berri in #26390
  • [Feature] UI - Users: Add Send Invitation Email Toggle by @yuneng-berri in #25808
  • [Refactor] Proxy: move projects management to enterprise package by @yuneng-berri in #25677
  • fix(proxy): single-team DB fallback when JWT has no team_id by @milan-berri in #26418
  • [Fix] Harden team metadata handling in /team/new and /team/update by @yuneng-berri in #26464
  • [Feat] Add azure/gpt-5.5 + azure/gpt-5.5-pro entries (+ dated variants) by @mateo-berri in #26361
  • feat(proxy): add /v1/memory CRUD endpoints by @krrish-berri-2 in #26218
  • [Fix] Harden pass-through target URL construction by @yuneng-berri in #26467
  • [Fix] Tighten caller-permission checks on key route fields by @yuneng-berri in #26492
  • [Fix] Extend caller-permission checks to service-account + tighten raw-body acceptance by @yuneng-berri in #26493
  • feat: UI setting to disable /key/generate for org admins by @ryan-crabbe-berri in #26442
  • fix(ui): stop injecting $0 cost on model edit by @ryan-crabbe-berri in #26001
  • fix: preserve service_account_id in metadata on /key/update by @ryan-crabbe-berri in #26004
  • [Feature] UI - Spend Logs: sortable Model and TTFT columns by @yuneng-berri in #26488
  • [Fix] Restrict /global/spend/* routes to admin roles by @yuneng-berri in #26490
  • [Infra] Merge dev branch by @yuneng-berri in #26496
  • Sync litellm_staging_03_23_2026 with litellm_internal_staging by @Chesars in #26510
  • Litellm staging 03 23 2026 by @Chesars in #24428
  • ci: add supply-chain guard to block fork PRs that modify dependencies by @krrish-berri-2 in #26511
  • [Fix] Align MCP OAuth proxy endpoints with per-server acces...

v1.83.14-stable

02 May 04:00
3d2b8fe

Verify Docker Image Signature

All LiteLLM Docker images are signed with cosign. Every release is signed with the same key introduced in commit 0112e53.

Verify using the pinned commit hash (recommended):

A commit hash is cryptographically immutable, so this is the strongest way to ensure you are using the original signing key:

cosign verify \
  --key https://raw.githubusercontent.com/BerriAI/litellm/0112e53046018d726492c814b3644b7d376029d0/cosign.pub \
  ghcr.io/berriai/litellm:v1.83.14-stable

Verify using the release tag (convenience):

Tags are protected in this repository and resolve to the same key. This option is easier to read but relies on tag protection rules:

cosign verify \
  --key https://raw.githubusercontent.com/BerriAI/litellm/v1.83.14-stable/cosign.pub \
  ghcr.io/berriai/litellm:v1.83.14-stable

Expected output:

The following checks were performed on each of these signatures:
  - The cosign claims were validated
  - The signatures were verified against the specified public key

What's Changed

  • fix: preserve tool_use input args in Anthropic adapter streaming by @Chesars in #24355
  • fix: preserve role='assistant' in Azure streaming with include_usage by @Chesars in #24354
  • fix: map Zhipu GLM non-standard finish_reason values by @Chesars in #24373
  • fix(responses-api): apply GPT-5 temperature validation by @Chesars in #24371
  • fix(bedrock): sort assistant content blocks so text precedes toolUse by @Chesars in #24368
  • fix(gemini): filter params from embedding requests by @Chesars in #24370
  • fix(gemini): read web search cost from model_info instead of hardcode by @Chesars in #24372
  • fix(gemini): include DOCUMENT modality tokens in cost calculation by @Chesars in #24410
  • docs: add missing observability integrations to View All page by @Chesars in #24420
  • fix(vertex_ai): forward dimensions parameter in multimodalembedding requests by @Chesars in #24415
  • refactor(responses): extract shared format mapping between Responses API and Chat Completions bridges by @Chesars in #24417
  • fix(model-prices): migrate 38 models from legacy max_tokens to max_input_tokens/max_output_tokens by @Chesars in #24422
  • feat(bedrock): add GLM-5 and Minimax M2.5 with regional aliases by @Chesars in #24423
  • fix: update bedrock claude sonnet/opus 4.6 above 200k token pricing and sonnet 4.6 max_input_tokens to 1M by @dongyu-turo in #24164
  • merge litellm_internal_staging by @Sameerlite in #25942
  • merge litellm_internal_staging by @Sameerlite in #25945
  • Sync litellm_staging_03_22_2026 with litellm_internal_staging by @Chesars in #26283
  • merge main by @Sameerlite in #26301
  • merge main by @Sameerlite in #26303
  • fix(router): restore BYOK key injection for vector store endpoints with team-scoped deployments by @shivamrawat1 in #25746
  • [Infra] Remove CCI/GHA test duplication and semantically shard proxy DB tests by @yuneng-berri in #26356
  • merge main by @Sameerlite in #26381
  • Split MCP routes into inference vs management (unblock Admin UI on DISABLE_LLM_API_ENDPOINTS nodes) by @ryan-crabbe-berri in #26367
  • feat(responses): add use_chat_completions_api flag for openai/ models with custom api_base by @Sameerlite in #25346
  • fix(team_endpoints): auto-add SSO team members to org on move (proxy admin only) by @ishaan-berri in #26377
  • sync litellm_staging_03_22_2026 with litellm_internal_staging by @Chesars in #26440
  • fix(proxy): respect object-level permissions for managed vector store endpoints by @shivamrawat1 in #26351
  • feat(pricing): gemini-embedding-2 GA cost map, blog, and test by @Sameerlite in #26391
  • fix(responses): normalize bridged object field by @Sameerlite in #26327
  • feat(models): add versioned GPT-5.4 mini/nano snapshots by @Sameerlite in #26115
  • fix(proxy): preserve anthropic_messages call type for /v1/messages logging by @Sameerlite in #26248
  • feat(responses): strip custom_tool_call namespace for all providers by @Sameerlite in #26221
  • fix(anthropic): strip Gemini thought suffix from streaming tool_use id by @Sameerlite in #25935
  • feat(docs): align fenced code block padding on blog and doc pages by @Sameerlite in #25932
  • docs(gemini): Gemini 3 thinking_level defaults and release note by @Sameerlite in #25842
  • docs(proxy): clarify x-litellm-model-group vs provider model id by @Sameerlite in #25497
  • [Fix] Tests - Proxy: Isolate master_key/prisma_client module globals between tests by @yuneng-berri in #26362
  • feat(openai): add route_all_chat_openai_to_responses global flag by @Sameerlite in #25359
  • Litellm staging 03 22 2026 by @Chesars in #24374
  • chore(packaging): declare MIT license in litellm-proxy-extras metadata by @stuxf in #26369
  • chore(deps): bump vulnerable dependencies by @stuxf in #26365
  • fix(auth): centralize common_checks to close authorization bypass by @stuxf in #26279
  • fix(mcp): harden OAuth authorize/token endpoints (BYOK + discoverable) by @stuxf in #26274
  • [Feat] Day-0 support for GPT-5.5 and GPT-5.5 Pro by @mateo-berri in #26449
  • [Infra] Remove docs/my-website, point contributors to litellm-docs repo by @yuneng-berri in #26454
  • fix(vertex passthrough): log :embedContent and :batchEmbedContents responses by @ishaan-berri in #26146
  • fix(jwt-auth): apply team TPM/RPM + attribution for admins using x-litellm-team-id by @ryan-crabbe-berri in #26438
  • [Infra] Declare proprietary license in litellm-enterprise metadata by @yuneng-berri in #26457
  • feat(guardrails): LLM-as-a-Judge guardrail by @ishaan-berri in #26360
  • [Fix] Guardrail param handling in list and submission endpoints by @yuneng-berri in #26390
  • [Feature] UI - Users: Add Send Invitation Email Toggle by @yuneng-berri in #25808
  • [Refactor] Proxy: move projects management to enterprise package by @yuneng-berri in #25677
  • fix(proxy): single-team DB fallback when JWT has no team_id by @milan-berri in #26418
  • [Fix] Harden team metadata handling in /team/new and /team/update by @yuneng-berri in #26464
  • [Feat] Add azure/gpt-5.5 + azure/gpt-5.5-pro entries (+ dated variants) by @mateo-berri in #26361
  • feat(proxy): add /v1/memory CRUD endpoints by @krrish-berri-2 in #26218
  • [Fix] Harden pass-through target URL construction by @yuneng-berri in #26467
  • [Fix] Tighten caller-permission checks on key route fields by @yuneng-berri in #26492
  • [Fix] Extend caller-permission checks to service-account + tighten raw-body acceptance by @yuneng-berri in #26493
  • feat: UI setting to disable /key/generate for org admins by @ryan-crabbe-berri in #26442
  • fix(ui): stop injecting $0 cost on model edit by @ryan-crabbe-berri in #26001
  • fix: preserve service_account_id in metadata on /key/update by @ryan-crabbe-berri in #26004
  • [Feature] UI - Spend Logs: sortable Model and TTFT columns by @yuneng-berri in #26488
  • [Fix] Restrict /global/spend/* routes to admin roles by @yuneng-berri in #26490
  • [Infra] Merge dev branch by @yuneng-berri in #26496
  • Sync litellm_staging_03_23_2026 with litellm_internal_staging by @Chesars in #26510
  • Litellm staging 03 23 2026 by @Chesars in #24428
  • ci: add supply-chain guard to block fork PRs that modify dependencies by @krrish-berri-2 in #26511
  • [Fix] Align MCP OAuth proxy endpoints with per-server...

v1.83.13-nightly

24 Apr 05:47
7b47dff

Verify Docker Image Signature

All LiteLLM Docker images are signed with cosign. Every release is signed with the same key introduced in commit 0112e53.

Verify using the pinned commit hash (recommended):

A commit hash is cryptographically immutable, so this is the strongest way to ensure you are using the original signing key:

cosign verify \
  --key https://raw.githubusercontent.com/BerriAI/litellm/0112e53046018d726492c814b3644b7d376029d0/cosign.pub \
  ghcr.io/berriai/litellm:v1.83.13-nightly

Verify using the release tag (convenience):

Tags are protected in this repository and resolve to the same key. This option is easier to read but relies on tag protection rules:

cosign verify \
  --key https://raw.githubusercontent.com/BerriAI/litellm/v1.83.13-nightly/cosign.pub \
  ghcr.io/berriai/litellm:v1.83.13-nightly

Expected output:

The following checks were performed on each of these signatures:
  - The cosign claims were validated
  - The signatures were verified against the specified public key

What's Changed

New Contributors

Full Changelog: v1.83.12-nightly...v1.83.13-nightly