
feat: Ollama cloud model support (keyless auth) #316

Merged
qhkm merged 3 commits into main from feat/ollama-keyless-auth
Mar 11, 2026

Conversation

@qhkm
Owner

@qhkm qhkm commented Mar 11, 2026

Summary

  • Allow Ollama and vLLM providers to resolve without requiring a dummy API key
  • Skip Authorization header when no API key is configured (local instances don't need auth)
  • When api_key IS set, auth works normally (cloud Ollama with auth)
  • Update configured_provider_names and configured_provider_models to include keyless providers

Closes #284

Changes

  • src/providers/registry.rs: Add api_key_required field to ProviderSpec, keyless fallback in resolve_credential
  • src/providers/openai.rs: Conditional auth header in chat(), chat_stream(), embed()
  • src/channels/model_switch.rs: Update Ollama comment label
  • CLAUDE.md: Document keyless provider configuration
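The registry change above can be sketched as follows. The `api_key_required` field and the `resolve_credential` fallback come from the PR description; the surrounding types (`ProviderSpec` with only one field, a simplified `ProviderConfig`, the `Option`-based return) are assumptions for illustration, not the repo's actual code.

```rust
// Sketch of the keyless credential fallback: a configured key wins, a
// keyless provider with a bare config section gets an empty-string
// credential, and a key-required provider without a key stays unresolved.
struct ProviderSpec {
    api_key_required: bool,
}

struct ProviderConfig {
    api_key: Option<String>,
}

fn resolve_credential(spec: &ProviderSpec, config: Option<&ProviderConfig>) -> Option<String> {
    let config = config?; // the provider section must exist in config at all
    match &config.api_key {
        Some(key) => Some(key.clone()),
        None if !spec.api_key_required => Some(String::new()), // keyless fallback
        None => None, // key required but missing: provider stays unconfigured
    }
}

fn main() {
    let ollama = ProviderSpec { api_key_required: false };
    let openai = ProviderSpec { api_key_required: true };
    let bare = ProviderConfig { api_key: None };
    let keyed = ProviderConfig { api_key: Some("secret".into()) };

    assert_eq!(resolve_credential(&ollama, Some(&bare)), Some(String::new()));
    assert_eq!(resolve_credential(&ollama, Some(&keyed)), Some("secret".to_string()));
    assert_eq!(resolve_credential(&openai, Some(&bare)), None);
    assert_eq!(resolve_credential(&openai, None), None);
    println!("keyless fallback behaves as described");
}
```

This matches the backward-compatibility claim: providers with `api_key_required: true` behave exactly as before, since the fallback arm is never reached for them.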

Config examples

```json
{"providers": {"ollama": {}}}
{"providers": {"ollama": {"api_base": "https://my-cloud-ollama.example.com/v1"}}}
{"providers": {"ollama": {"api_key": "secret", "api_base": "https://my-cloud-ollama.example.com/v1"}}}
```

Test plan

  • 6 new registry tests (keyless resolution, bare config, backward compat, configured_provider_names/models)
  • 2 new OpenAI provider tests (auth header skip/send)
  • Full lib suite: 3100 passed, 0 failed
  • Doc tests: 127 passed
  • clippy clean, fmt clean

🤖 Generated with Claude Code

Summary by CodeRabbit

  • New Features

    • Support for keyless provider configuration—Ollama and vLLM can now be used without requiring an API key.
  • Documentation

    • Added comprehensive guide for configuring providers without API keys.
    • Clarified that Ollama supports both local and cloud deployments.

qhkm and others added 3 commits March 11, 2026 10:23
Add `api_key_required: bool` to `ProviderSpec` and set it to `false`
for ollama and vllm. When `api_key_required` is false and the provider
section is present in config but has no key, `resolve_credential` now
returns an empty-string credential instead of skipping the provider.

All other providers retain `api_key_required: true` (no behavior change).

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Allows keyless providers (Ollama, vLLM) to make requests
without sending a spurious auth header.
@coderabbitai

coderabbitai bot commented Mar 11, 2026

No actionable comments were generated in the recent review. 🎉

ℹ️ Recent review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: f607694d-dcb5-4fa4-8bed-dfc21d97412c

📥 Commits

Reviewing files that changed from the base of the PR and between e7a3ea6 and 6af0969.

📒 Files selected for processing (4)
  • CLAUDE.md
  • src/channels/model_switch.rs
  • src/providers/openai.rs
  • src/providers/registry.rs

📝 Walkthrough

Walkthrough

This PR introduces keyless provider support by adding an api_key_required boolean field to provider specifications, modifying credential resolution to handle API-key-absent scenarios, updating OpenAI's auth header logic to conditionally attach authorization headers, and adjusting provider selection filtering to include keyless providers without requiring API keys.

Changes

Cohort / File(s) — Summary

  • Documentation — CLAUDE.md: Added a "Keyless Providers" section documenting how Ollama and vLLM can be configured without API keys, including provider configuration examples and a clarification that no Authorization header is sent when api_key is absent.
  • Provider Registry & Credential Resolution — src/providers/registry.rs: Introduced the api_key_required: bool field on ProviderSpec; updated provider selection filtering to include keyless providers conditionally; changed the resolve_credential signature from provider_name: &str to spec: &ProviderSpec; added a keyless fallback that returns an empty-string ApiKey credential when no key is resolved, the provider doesn't require one, and a config section exists; extended tests for keyless provider resolution.
  • OpenAI Authentication — src/providers/openai.rs: Added a guard in auth_header_pair to return ("", "") for empty API keys; updated request construction in chat, chat_stream, and embeddings to attach the Authorization header only when the header name is non-empty; added tests validating the empty-key and non-empty-key scenarios.
  • Provider Metadata — src/channels/model_switch.rs: Updated the Ollama provider comment from "(local)" to "(local or cloud)" to reflect cloud model support.
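The conditional auth-header logic described in the walkthrough can be sketched like this. The `auth_header_pair` name and its `("", "")` guard come from the walkthrough; the `apply_auth` helper and the header-list shape are illustrative assumptions, not the repo's actual request-building code.

```rust
// Guard from the walkthrough: an empty API key yields an empty header
// pair, which the caller interprets as "skip the Authorization header".
fn auth_header_pair(api_key: &str) -> (&'static str, String) {
    if api_key.is_empty() {
        ("", String::new()) // keyless: no spurious auth header
    } else {
        ("Authorization", format!("Bearer {api_key}"))
    }
}

// Hypothetical helper: attach the header only when the name is non-empty,
// mirroring the conditional attach in chat / chat_stream / embeddings.
fn apply_auth(headers: &mut Vec<(String, String)>, api_key: &str) {
    let (name, value) = auth_header_pair(api_key);
    if !name.is_empty() {
        headers.push((name.to_string(), value));
    }
}

fn main() {
    let mut keyless = Vec::new();
    apply_auth(&mut keyless, "");
    assert!(keyless.is_empty()); // local Ollama: request goes out with no auth

    let mut keyed = Vec::new();
    apply_auth(&mut keyed, "secret");
    assert_eq!(keyed, vec![("Authorization".to_string(), "Bearer secret".to_string())]);
    println!("auth header attached only when a key is present");
}
```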

Sequence Diagram

```mermaid
sequenceDiagram
    participant Client
    participant Registry as Provider Registry
    participant CredResolver as Credential Resolver
    participant OpenAI as OpenAI Provider
    participant Request as HTTP Request

    alt Keyless Provider (Ollama/vLLM)
        Client->>Registry: Select provider (api_key_required=false)
        Registry->>CredResolver: resolve_credential(spec)
        CredResolver->>CredResolver: Check api_key_required=false
        CredResolver-->>Client: Return ApiKey("")
        Client->>OpenAI: Build request with empty key
        OpenAI->>OpenAI: auth_header_pair("") → ("", "")
        OpenAI->>Request: Skip Authorization header
    else Key-Required Provider
        Client->>Registry: Select provider (api_key_required=true)
        Registry->>CredResolver: resolve_credential(spec)
        CredResolver->>CredResolver: Resolve API key from config
        CredResolver-->>Client: Return ApiKey("sk-...")
        Client->>OpenAI: Build request with key
        OpenAI->>OpenAI: auth_header_pair("sk-...") → ("Authorization", "Bearer sk-...")
        OpenAI->>Request: Attach Authorization header
    end
```

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~45 minutes


Poem

🐰 Hopping through configs both near and far,
No keys needed—Ollama, a star!
Empty strings dance where Bearer once stood,
Keyless providers, misunderstood!
Cloud and local, together at last—
Configuration's future, unsurpassed!

🚥 Pre-merge checks — ✅ 5 passed

  • Description Check — Passed: Check skipped; CodeRabbit's high-level summary is enabled.
  • Title Check — Passed: The title accurately summarizes the main change: enabling keyless authentication for Ollama providers, which directly addresses the PR's primary objective of cloud Ollama support.
  • Linked Issues Check — Passed: The PR implements the objective from issue #284 by enabling cloud Ollama configuration with keyless authentication, allowing providers to work with or without API keys as required.
  • Out of Scope Changes Check — Passed: All changes directly support keyless provider authentication: the registry changes add the api_key_required flag, openai.rs conditionally sends auth headers, model_switch.rs updates the Ollama label, and CLAUDE.md documents the feature.
  • Docstring Coverage — Passed: Docstring coverage is 100.00%, above the required threshold of 80.00%.


@qhkm qhkm merged commit 991c257 into main Mar 11, 2026
9 checks passed
@qhkm qhkm deleted the feat/ollama-keyless-auth branch March 11, 2026 11:02
taqtiqa-mark pushed a commit to taqtiqa-mark/zeptoclaw that referenced this pull request Mar 25, 2026
Development

Successfully merging this pull request may close these issues: Ollama cloud model support