# Bug Report: `aws_role_name` in proxy config YAML silently ignored — cross-account Bedrock AssumeRole never fires
## Summary
When configuring `aws_role_name` in `litellm_params` via the proxy config YAML for cross-account Bedrock access, the parameter is silently dropped during optional-params processing. The STS AssumeRole call never fires, and Bedrock calls land in the host account's IAM identity instead of the target role's account.
## Use Case
We're a government organization (City and County of San Francisco) deploying LiteLLM as a centralized LLM Gateway on ECS Fargate. We have a central platform account running LiteLLM and multiple department accounts with Bedrock model access enabled. Each department has its own AWS account with a `LiteLLMBedrockAccess` IAM role that the LiteLLM ECS task role can assume.
Our config uses tag-based routing so each department's models route to their own Bedrock account for billing isolation:
```yaml
model_list:
  - model_name: claude-sonnet-4-6
    litellm_params:
      model: bedrock/us.anthropic.claude-sonnet-4-6
      aws_region_name: us-west-2
      aws_role_name: arn:aws:iam::244910800867:role/LiteLLMBedrockAccess
      aws_session_name: litellm-dept-a
    model_info:
      tags:
        - dept-a
  - model_name: claude-sonnet-4-6
    litellm_params:
      model: bedrock/us.anthropic.claude-sonnet-4-6
      aws_region_name: us-west-2
      aws_role_name: arn:aws:iam::987654321012:role/LiteLLMBedrockAccess
      aws_session_name: litellm-dept-b
    model_info:
      tags:
        - dept-b

litellm_settings:
  enable_tag_filtering: true
  drop_params: true
```
This config format matches the LiteLLM documentation at https://docs.litellm.ai/docs/providers/bedrock (cross-account section).
**Expected behavior:** LiteLLM task role calls STS AssumeRole into the department role, then invokes Bedrock with the assumed credentials.

**Actual behavior:** AssumeRole never fires. Bedrock is called with the ECS task role's own credentials (central account), which doesn't have Bedrock model access enabled. Error: "Model use case details have not been submitted."
## Root Cause (traced through source code)
The parameter flows correctly through the router but gets dropped inside `get_optional_params()` before it reaches the Bedrock handler. Here is the exact code path:
**1. Router passes `aws_role_name` correctly**

`litellm/router.py` — The router spreads `litellm_params` from the deployment config into `litellm.completion(**kwargs)`. At this point, `aws_role_name` is a top-level kwarg. ✅
**2. `completion()` includes it in non-default params**

`litellm/main.py` — `get_non_default_completion_params(kwargs)` returns `aws_role_name` because it is not in `all_litellm_params` or `OPENAI_CHAT_COMPLETION_PARAMS`. ✅
**3. `get_optional_params()` receives it via `**kwargs`**

`litellm/utils.py:3934` — `aws_role_name` lands in `**kwargs` (the catch-all). Then:

```python
passed_params = locals().copy()
special_params = passed_params.pop("kwargs")  # aws_role_name is here
```

✅
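The `locals().copy()` / `pop("kwargs")` pattern is easy to model in isolation. The sketch below is illustrative only (the function name and signature are invented, not LiteLLM's actual API); it shows how a provider-specific kwarg ends up in the catch-all dict rather than among the named parameters:

```python
# Toy model of the pattern described above: named params stay in passed_params,
# everything else lands in the "kwargs" catch-all.
def collect_params(temperature=None, top_p=None, **kwargs):
    passed_params = locals().copy()              # snapshot of named args + kwargs
    special_params = passed_params.pop("kwargs")  # aws_role_name ends up here
    return passed_params, special_params

passed, special = collect_params(
    temperature=0.2,
    aws_role_name="arn:aws:iam::123456789012:role/Example",
)
# special == {"aws_role_name": "arn:aws:iam::123456789012:role/Example"}
```

Any name not declared in the signature lands in `special_params`, which is why the merge-and-filter in step 4 decides whether the param survives.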
**4. `base_pre_process_non_default_params()` merges it, then filters it out**

`litellm/utils.py:3688-3723`

Lines 3688-3702: `aws_role_name` is correctly merged from `special_params` into `passed_params` (the `aws_` prefix check correctly allows it for bedrock):

```python
for k, v in special_params.items():
    if k.startswith("aws_") and (
        custom_llm_provider != "bedrock"  # correctly NOT skipped for bedrock
        ...
    ):
        continue
    passed_params[k] = v  # aws_role_name merged into passed_params ✅
```
Lines 3705-3723 — THE BUG: the comprehension that builds `non_default_params` requires `k in default_param_values`:

```python
non_default_params = {
    k: v
    for k, v in passed_params.items()
    if (
        ...
        and k in default_param_values  # ← DEFAULT_CHAT_COMPLETION_PARAM_VALUES
        and v != default_param_values[k]
        ...
    )
}
```

`DEFAULT_CHAT_COMPLETION_PARAM_VALUES` (`litellm/constants.py:668`) contains only OpenAI chat completion params (`temperature`, `top_p`, `tools`, etc.). `aws_role_name` is not in this dict, so it is filtered out. ❌
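The drop is reproducible outside LiteLLM with a self-contained reconstruction of that comprehension. The two-key dict below is a stand-in for `DEFAULT_CHAT_COMPLETION_PARAM_VALUES`, not the real constant:

```python
# Stand-in for DEFAULT_CHAT_COMPLETION_PARAM_VALUES: OpenAI params only.
default_param_values = {"temperature": None, "top_p": None}

# State after step 4's merge: aws_role_name made it into passed_params.
passed_params = {
    "temperature": 0.2,
    "aws_role_name": "arn:aws:iam::123456789012:role/Example",
}

# The filtering comprehension, reduced to the two relevant clauses.
non_default_params = {
    k: v
    for k, v in passed_params.items()
    if k in default_param_values and v != default_param_values[k]
}

# aws_role_name has no key in default_param_values, so it is silently dropped.
print(non_default_params)  # {'temperature': 0.2}
```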
**5. `map_special_auth_params()` doesn't help**

`litellm/utils.py:3851-3856` → `litellm/llms/bedrock/common_utils.py:77-88`

`map_special_auth_params()` only maps `region_name` → `aws_region_name`. It does not handle `aws_role_name`, `aws_session_name`, or any other AWS credential params. ❌
**6. Handler gets `None`**

`litellm/llms/bedrock/chat/converse_handler.py:324`:

```python
aws_role_name = optional_params.pop("aws_role_name", None)  # returns None
```

`get_credentials()` is called with `aws_role_name=None` → no AssumeRole → ambient credentials used. ❌
**Meanwhile, `litellm_params` has it**

`litellm/litellm_core_utils/get_litellm_params.py:160-162` correctly stores `aws_role_name` in `litellm_params` via `_OPTIONAL_KWARGS_KEYS`. The handler receives `litellm_params` as a parameter, but reads `aws_role_name` from `optional_params` instead. Nobody transfers it.
## Reproduction
```yaml
# config.yaml
model_list:
  - model_name: test-model
    litellm_params:
      model: bedrock/us.anthropic.claude-sonnet-4-6
      aws_region_name: us-west-2
      aws_role_name: arn:aws:iam::TARGET_ACCOUNT:role/SomeRole
      aws_session_name: litellm-test

litellm_settings:
  drop_params: true
```

```shell
litellm --config config.yaml --detailed_debug
# Send a chat completion request
# Observe: no STS AssumeRole call in logs, Bedrock called with host credentials
```
## Suggested Fix
**Option A (minimal):** In `converse_handler.py:completion()`, fall back to `litellm_params` when `optional_params` doesn't have AWS credential fields:

```python
aws_role_name = optional_params.pop("aws_role_name", None) or litellm_params.get("aws_role_name")
aws_session_name = optional_params.pop("aws_session_name", None) or litellm_params.get("aws_session_name")
# ... same for the other aws_ params
```
**Option B (proper):** Add `aws_role_name`, `aws_session_name`, `aws_access_key_id`, `aws_secret_access_key`, `aws_session_token`, `aws_profile_name`, `aws_web_identity_token`, `aws_sts_endpoint`, `aws_external_id`, and `aws_bedrock_runtime_endpoint` to `DEFAULT_CHAT_COMPLETION_PARAM_VALUES` in `litellm/constants.py` so they survive the filter in `base_pre_process_non_default_params()`.
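A sketch of what Option B could look like, using a stand-in dict and assuming `None` as the sentinel default; the real dict in `litellm/constants.py` and its sentinel convention should be checked before adopting this:

```python
# Stand-in for the real dict at litellm/constants.py:668.
DEFAULT_CHAT_COMPLETION_PARAM_VALUES = {"temperature": None, "top_p": None}

# Hypothetical Option B: register the AWS credential params with a None default.
AWS_AUTH_PARAM_DEFAULTS = dict.fromkeys(
    [
        "aws_role_name", "aws_session_name", "aws_access_key_id",
        "aws_secret_access_key", "aws_session_token", "aws_profile_name",
        "aws_web_identity_token", "aws_sts_endpoint", "aws_external_id",
        "aws_bedrock_runtime_endpoint",
    ]
)
DEFAULT_CHAT_COMPLETION_PARAM_VALUES.update(AWS_AUTH_PARAM_DEFAULTS)

# With the keys registered, the step-4 comprehension now keeps aws_role_name.
passed_params = {"aws_role_name": "arn:aws:iam::123456789012:role/Example"}
non_default_params = {
    k: v
    for k, v in passed_params.items()
    if k in DEFAULT_CHAT_COMPLETION_PARAM_VALUES
    and v != DEFAULT_CHAT_COMPLETION_PARAM_VALUES[k]
}
```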
**Option C:** Expand `map_special_auth_params()` in `bedrock/common_utils.py` to copy all `aws_*` params from `passed_params` into `optional_params`, not just `region_name`.
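Option C could be sketched as follows; the function below only illustrates the shape of a widened mapping and is not the actual signature in `bedrock/common_utils.py`:

```python
def map_special_auth_params(non_default_params: dict, optional_params: dict) -> dict:
    """Hypothetical widening: forward region_name plus every aws_* param."""
    for param, value in non_default_params.items():
        if param == "region_name":
            # the only mapping the current implementation performs
            optional_params["aws_region_name"] = value
        elif param.startswith("aws_"):
            # new: pass AWS credential/endpoint params through untouched
            optional_params[param] = value
    return optional_params


mapped = map_special_auth_params(
    {"region_name": "us-west-2", "aws_role_name": "arn:aws:iam::123456789012:role/Example"},
    {},
)
# mapped contains both aws_region_name and aws_role_name
```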
## Environment

- LiteLLM version: `main-v1.82.3-stable` (also tested `main-latest`)
- Deployment: ECS Fargate with a task role that has `sts:AssumeRole` permission to the target roles
- IAM trust policies verified (a direct `aws sts assume-role` works from the task role)
- Direct Bedrock invocation from the target account works
## Impact
This blocks any proxy deployment that uses per-model cross-account Bedrock routing via `aws_role_name` in the config YAML. The parameter is documented, accepted by the config parser, and stored correctly in `litellm_params`, but never reaches the handler that needs it.