Git provider
Github Cloud
System Info
Describe the bug
I am using pr_agent with Claude (Anthropic) as my LLM. Starting about two days ago, LLM inference began failing with a litellm.AuthenticationError reporting "invalid x-api-key".
To Reproduce
Run pr_agent configured to use Anthropic (Claude).
The error is raised during the chat_completion call in litellm_ai_handler.py.
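For reference, this is roughly how the Anthropic credentials are configured on my side. This is a sketch of a typical pr_agent secrets file; the exact section/key names should be checked against your own setup, and the key value is redacted:

```toml
# .secrets.toml (sketch; section name assumed from a typical pr_agent + litellm setup)
[anthropic]
key = "sk-ant-..."  # the Anthropic API key passed through to litellm as x-api-key
```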
Expected behavior
Successful LLM inference using the Anthropic API without authentication errors.
Logs / Error Message
Here is the error log I'm getting:
{"text": "Error during LLM inference: litellm.AuthenticationError: AnthropicException - {"type":"error","error":{"type":"authentication_error","message":"invalid x-api-key"},"request_id":"req_011CZRJKw2HYvZXX8mSK2LHr"}\n", "record": {"elapsed": {"repr": "0:00:25.615212", "seconds": 25.615212}, "exception": null, "extra": {}, "file": {"name": "litellm_ai_handler.py", "path": "/app/pr_agent/algo/ai_handlers/litellm_ai_handler.py"}, "function": "chat_completion", "level": {"icon": "⚠️", "name": "WARNING", "no": 30}, "line": 418, "message": "Error during LLM inference: litellm.AuthenticationError: AnthropicException - {"type":"error","error":{"type":"authentication_error","message":"invalid x-api-key"},"request_id":"req_011CZRJKw2HYvZXX8mSK2LHr"}", "module": "litellm_ai_handler", "name": "pr_agent.algo.ai_handlers.litellm_ai_handler", "process": {"id": 7, "name": "MainProcess"}, "thread": {"id": 140704083827584, "name": "MainThread"}, "time": {"repr": "2026-03-26 07:30:07.944313+00:00", "timestamp": 1774510207.944313}}}
Additional context
LLM Provider: Anthropic (Claude)
This issue started about two days ago; the same configuration was working fine previously, and the API key has not been changed on my end.
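One quick local check that helped me rule out an obviously malformed key (this is a hypothetical helper, not part of pr_agent; the assumption that Anthropic keys start with "sk-ant-" is based on the keys issued by the Anthropic console):

```python
import os


def looks_like_anthropic_key(key: str) -> bool:
    """Rough sanity check: Anthropic console keys typically start with
    'sk-ant-' and are fairly long. This does NOT validate the key against
    the API; it only catches truncated or wrong-variable mistakes."""
    return key.startswith("sk-ant-") and len(key) > 20


# Example: check the key loaded from the environment before blaming the API.
key = os.environ.get("ANTHROPIC_API_KEY", "")
print("key looks well-formed:", looks_like_anthropic_key(key))
```

In my case the key passes this check and also worked previously, which is why I suspect something changed on the request side (e.g. how litellm forwards the x-api-key header) rather than the key itself.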