
feat(ollama): Add API key support for Ollama Cloud#2278

Merged
naorpeled merged 1 commit into qodo-ai:main from arynyklas:feat/add-ollama-api-key-support
Mar 24, 2026

Conversation

@arynyklas
Contributor

@arynyklas arynyklas commented Mar 18, 2026

Add API key support for Ollama Cloud authentication

This change enables PR-Agent to authenticate with Ollama Cloud (ollama.com)
by adding support for the OLLAMA.API_KEY configuration option. Previously,
only local Ollama instances without authentication were supported.

The API key is now passed through to litellm completion calls when configured,
allowing users to leverage hosted Ollama models that require authentication.

Also updates the secrets template documentation to clarify the distinction
between Ollama Cloud and local Ollama deployments.

Fixes #2267

Add support for OLLAMA.API_KEY configuration to enable authentication
with Ollama Cloud (ollama.com). Previously only local Ollama instances
were supported without authentication.

- Pass api_key to litellm completion calls when configured
- Update secrets template with documentation for the new api_key field
- Clarify api_base comment to distinguish between Ollama Cloud and local
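The flow described in the bullets above can be sketched in isolation. This is an illustrative stand-in, not the merged pr-agent code: `_LiteLLMStub`, `init_handler`, and `build_completion_kwargs` are hypothetical names, and a plain dict replaces the real `get_settings()` accessor.

```python
# Standalone sketch of the PR's approach (names are illustrative).

class _LiteLLMStub:
    """Stand-in for the global `litellm` module object."""
    api_key = None

litellm = _LiteLLMStub()

def init_handler(settings: dict) -> None:
    # Mirror the PR: copy OLLAMA.API_KEY onto the shared litellm object
    api_key = settings.get("OLLAMA.API_KEY")
    if api_key:
        litellm.api_key = api_key

def build_completion_kwargs(model: str, messages: list) -> dict:
    kwargs = {"model": model, "messages": messages}
    # Forward the key to the completion call when one was configured
    if litellm.api_key:
        kwargs["api_key"] = litellm.api_key
    return kwargs

init_handler({"OLLAMA.API_KEY": "sk-ollama-example"})
kwargs = build_completion_kwargs("ollama/llama3",
                                 [{"role": "user", "content": "hi"}])
```

Note that this mirrors the PR's use of a single shared `litellm.api_key`; the review comments below discuss why that global is problematic in multi-provider setups.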
@arynyklas arynyklas marked this pull request as ready for review March 18, 2026 06:16
@qodo-free-for-open-source-projects
Contributor

Review Summary by Qodo

Add API key support for Ollama Cloud authentication

✨ Enhancement


Walkthroughs

Description
• Add API key support for Ollama Cloud authentication
• Pass configured api_key to litellm completion calls
• Update secrets template with api_key documentation
• Clarify api_base distinction between cloud and local
Diagram
flowchart LR
  A["OLLAMA.API_KEY config"] -- "read from settings" --> B["litellm_ai_handler"]
  B -- "set litellm.api_key" --> C["litellm initialization"]
  B -- "pass api_key in kwargs" --> D["chat_completion call"]
  D -- "authenticate with" --> E["Ollama Cloud"]


File Changes

1. pr_agent/algo/ai_handlers/litellm_ai_handler.py ✨ Enhancement +4/-0

Add Ollama API key configuration and passing

• Add check for OLLAMA.API_KEY configuration and set it on litellm object
• Pass api_key from litellm to completion kwargs in chat_completion method
• Enable authentication with Ollama Cloud hosted models

pr_agent/algo/ai_handlers/litellm_ai_handler.py


2. pr_agent/settings/.secrets_template.toml 📝 Documentation +2/-1

Update Ollama configuration template documentation

• Add new api_key field under [ollama] section
• Update api_base comment to clarify Ollama Cloud vs local deployment
• Document that api_key is required for Ollama Cloud only

pr_agent/settings/.secrets_template.toml
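Based on the bullets above, the updated `[ollama]` section of `.secrets_template.toml` plausibly looks something like the following. The exact comment wording is an assumption; only the new `api_key` field and the cloud-vs-local distinction are stated in the PR:

```toml
[ollama]
# api_key is required only for Ollama Cloud (ollama.com);
# leave it empty for a local Ollama instance.
api_key = ""
# api_base: Ollama Cloud endpoint, or e.g. "http://localhost:11434" for local
api_base = ""
```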



@qodo-free-for-open-source-projects
Contributor

qodo-free-for-open-source-projects bot commented Mar 18, 2026

Code Review by Qodo

🐞 Bugs (1) 📘 Rule violations (2) 📎 Requirement gaps (0)



Action required

1. kwargs['api_key'] set unconditionally 📘 Rule violation ⛯ Reliability
Description
The PR unconditionally injects kwargs["api_key"] = litellm.api_key without validating/normalizing
the key or checking relevance to the selected provider/model. This can pass None/empty keys or
override provider-specific auth unexpectedly, causing avoidable runtime failures.
Code

pr_agent/algo/ai_handlers/litellm_ai_handler.py[R409-410]

+            kwargs["api_key"] = litellm.api_key
+
Evidence
Rule 18 requires early normalization/validation of user-controlled inputs (like API keys) and safe
defaults/errors; the new code always forwards whatever is currently in litellm.api_key (including
None/empty) without checks. This also increases the chance of errors rather than handling edge
cases gracefully as required by Rule 3.

Rule 3: Robust Error Handling
pr_agent/algo/ai_handlers/litellm_ai_handler.py[409-412]
Best Practice: Learned patterns

Agent prompt
The issue below was found during a code review. Follow the provided context and guidance below and implement a solution

## Issue description
`chat_completion()` always sets `kwargs["api_key"] = litellm.api_key` without validating the value or checking if it should be applied for the selected provider/model. This can forward `None`/empty keys or override provider-specific auth unexpectedly.

## Issue Context
The PR introduces `OLLAMA.API_KEY` support, but the forwarding behavior should be defensive: normalize/validate user-controlled inputs early and use safe defaults or explicit errors.

## Fix Focus Areas
- pr_agent/algo/ai_handlers/litellm_ai_handler.py[403-413]

ⓘ Copy this prompt and use it to remediate the issue with your preferred AI generation tools
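One possible remediation for the unconditional forwarding, sketched as a standalone helper (illustrative, not the merged code; the `forward_api_key` name and the `ollama/` prefix check are assumptions): normalize the key and gate it on provider relevance before injecting it into kwargs.

```python
def forward_api_key(kwargs: dict, raw_key, model: str) -> dict:
    """Only forward a validated, non-empty key, and only for Ollama models."""
    # Hypothetical relevance check: litellm model strings use "provider/name"
    is_ollama = model.startswith("ollama/")
    # Normalize early: strip whitespace, reject non-strings
    key = raw_key.strip() if isinstance(raw_key, str) else None
    if is_ollama and key:  # skips None, "", and whitespace-only keys
        kwargs["api_key"] = key
    return kwargs

print(forward_api_key({}, "  sk-abc  ", "ollama/llama3"))  # {'api_key': 'sk-abc'}
print(forward_api_key({}, None, "ollama/llama3"))          # {}
print(forward_api_key({}, "sk-abc", "gpt-4o"))             # {}
```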


2. Ollama key overwrites globals 🐞 Bug ⛯ Reliability
Description
__init__() writes OLLAMA.API_KEY into the global litellm.api_key, overwriting values
previously set for other providers (e.g., Groq/XAI/OpenRouter/Azure AD token). In multi-provider
configurations (e.g., using fallback_models), this can cause subsequent non-Ollama calls to
authenticate with the wrong key.
Code

pr_agent/algo/ai_handlers/litellm_ai_handler.py[R85-86]

+        if get_settings().get("OLLAMA.API_KEY", None):
+            litellm.api_key = get_settings().ollama.api_key
Evidence
Within the same initializer, multiple providers assign to the same global litellm.api_key; the new
Ollama assignment happens after Groq/XAI assignments and will overwrite them. Since the system
supports fallback models, a single process may need multiple provider credentials to coexist without
clobbering each other.

pr_agent/algo/ai_handlers/litellm_ai_handler.py[71-86]
pr_agent/algo/ai_handlers/litellm_ai_handler.py[115-133]
pr_agent/settings/configuration.toml[5-10]

Agent prompt
The issue below was found during a code review. Follow the provided context and guidance below and implement a solution

### Issue description
`OLLAMA.API_KEY` currently overwrites the shared global `litellm.api_key` during handler initialization. This global is also used by other providers and by other configuration branches (Groq, XAI, OpenRouter, Azure AD token), so it is not safe for multi-provider/fallback setups.

### Issue Context
The codebase supports `fallback_models`, which can mix providers; credentials should not clobber each other via shared global state.

### Fix Focus Areas
- pr_agent/algo/ai_handlers/litellm_ai_handler.py[67-92]
- pr_agent/algo/ai_handlers/litellm_ai_handler.py[115-139]
- pr_agent/algo/ai_handlers/litellm_ai_handler.py[398-413]

ⓘ Copy this prompt and use it to remediate the issue with your preferred AI generation tools
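One way to avoid clobbering the shared global, sketched under stated assumptions (the `PROVIDER_KEYS` mapping and `resolve_api_key` helper are hypothetical names, not pr-agent code): keep each provider's credential separately and resolve the right one per call from the model string, rather than writing every key into `litellm.api_key`.

```python
# Per-provider credential store; values are illustrative placeholders.
PROVIDER_KEYS = {
    "ollama": "sk-ollama-example",
    "groq": "gsk-groq-example",
}

def resolve_api_key(model: str):
    """Pick the key for the provider encoded in a litellm-style model string."""
    # litellm model strings are typically "provider/model-name";
    # a bare name is assumed to mean the default (OpenAI) provider here.
    provider = model.split("/", 1)[0] if "/" in model else "openai"
    return PROVIDER_KEYS.get(provider)

print(resolve_api_key("ollama/llama3"))     # sk-ollama-example
print(resolve_api_key("groq/llama3-70b"))   # gsk-groq-example
print(resolve_api_key("gpt-4o"))            # None
```

With this shape, fallback models from different providers each get their own key at call time, and no initializer ordering can overwrite another provider's credential.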



Remediation recommended

3. ollama.api_key non-defensive access 📘 Rule violation ⛯ Reliability
Description
The PR dereferences get_settings().ollama.api_key after checking
get_settings().get("OLLAMA.API_KEY"), which can still be fragile if nested attributes are missing
or not shaped as expected. Prefer using the same defensive access path (get() with defaults) and
validate required fields before use.
Code

pr_agent/algo/ai_handlers/litellm_ai_handler.py[R85-86]

+        if get_settings().get("OLLAMA.API_KEY", None):
+            litellm.api_key = get_settings().ollama.api_key
Evidence
Rule 17 requires defensive access for optional/nested configuration to avoid AttributeError/KeyError
and to validate required fields before use; the added code uses a nested attribute dereference
(.ollama.api_key) rather than continuing to use .get(...) with a default. This also relates to
Rule 18’s requirement to validate/normalize user-controlled inputs (API keys) early.

pr_agent/algo/ai_handlers/litellm_ai_handler.py[85-86]
Best Practice: Learned patterns

Agent prompt
The issue below was found during a code review. Follow the provided context and guidance below and implement a solution

## Issue description
The new Ollama API key assignment uses `get_settings().ollama.api_key` (nested attribute access) after a `.get("OLLAMA.API_KEY")` check. This is less defensive than continuing to use `.get()` and can be fragile if configuration nesting differs.

## Issue Context
Compliance requires defensive access for optional/nested configuration and early validation/normalization of user-controlled inputs like API keys.

## Fix Focus Areas
- pr_agent/algo/ai_handlers/litellm_ai_handler.py[82-87]

ⓘ Copy this prompt and use it to remediate the issue with your preferred AI generation tools
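The defensive-access pattern the rule asks for can be shown with a minimal stand-in for pr-agent's settings object (the `Settings` class here is a simplified sketch, not the real Dynaconf-backed accessor): read the dotted key once via `.get()` with a default, then use that value, instead of re-dereferencing `settings.ollama.api_key` after the check.

```python
class Settings:
    """Minimal stand-in for a dotted-key settings accessor."""
    def __init__(self, data: dict):
        self._data = data

    def get(self, dotted_key: str, default=None):
        # Walk "OLLAMA.API_KEY" -> data["ollama"]["api_key"], case-insensitive,
        # returning the default if any level is missing.
        node = self._data
        for part in dotted_key.lower().split("."):
            if not isinstance(node, dict) or part not in node:
                return default
            node = node[part]
        return node

settings = Settings({"ollama": {"api_key": "sk-ollama-example"}})

api_key = settings.get("OLLAMA.API_KEY", None)  # single defensive read
if api_key:
    print("configured")  # prints "configured"
```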



ⓘ The new review experience is currently in Beta.


Collaborator

@naorpeled naorpeled left a comment


LGTM! Thanks for this 🙏

Once my comment is addressed, will gladly merge this

@arynyklas
Contributor Author

> LGTM! Thanks for this 🙏
>
> Once my comment is addressed, will gladly merge this

didn't you forget?

@naorpeled
Collaborator

> LGTM! Thanks for this 🙏
> Once my comment is addressed, will gladly merge this
>
> didn't you forget?

Sorry, missed your reply.
Merging now, thanks!

@naorpeled naorpeled merged commit 5c0a4c9 into qodo-ai:main Mar 24, 2026
2 checks passed
shine911 pushed a commit to shine911/pr-agent that referenced this pull request Mar 25, 2026
shine911 added a commit to shine911/pr-agent that referenced this pull request Mar 25, 2026
Revert "feat(ollama): Add API key support for Ollama Cloud (qodo-ai#2278)"
shine911 pushed a commit to shine911/pr-agent that referenced this pull request Mar 25, 2026
- It caused Gemini model API calls to fail
Refs: qodo-ai#2278
shine911 added a commit to shine911/pr-agent that referenced this pull request Mar 25, 2026
- It caused Gemini model API calls to fail
Refs: qodo-ai#2278


Development

Successfully merging this pull request may close these issues.

Support for Ollama Cloud API

2 participants