
Add support for nullable API keys and LM Studio #3186

Merged

Re-bin merged 3 commits into HKUDS:main from sohamb117:feat/support_null_api_key on Apr 15, 2026

Conversation

@sohamb117
Contributor

Addresses #3185.

Adds support for nullable API keys and LM Studio as a first-class provider, improving the experience for local LLM users.

Key Changes

  • Nullable API keys: apiKey can now be set to null for local providers that don't require authentication (previously required dummy strings like "no-key" or "dummy")
  • LM Studio provider: Added lm_studio as a built-in provider with auto-detection via port 1234
  • Documentation updates: Updated examples for Custom Provider and vLLM, and added a comprehensive LM Studio setup guide

Motivation

Local LLM servers (LM Studio, vLLM, llama.cpp, etc.) don't require API keys, but the previous schema required a non-empty string. This forced users to use dummy values like "dummy" or "no-key", which was confusing and inelegant. Additionally, LM Studio is a popular desktop app for running local models, and adding it as a first-class provider improves discoverability.

Changes

Core Changes

nanobot/config/schema.py:

  • Changed ProviderConfig.api_key type from str to str | None to support null values
  • Added lm_studio field to ProvidersConfig

nanobot/providers/registry.py:

  • Added lm_studio provider spec with:
    • Aliases: lm-studio, lmstudio, lm_studio
    • Default base URL: http://localhost:1234/v1
    • Auto-detection via port 1234
    • Backend: openai_compat
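The provider spec described above might look roughly like this. The field names and helper are illustrative assumptions, not nanobot's actual registry internals; only the alias list, default base URL, port, and backend name come from the PR description.

```python
from urllib.parse import urlparse

# Illustrative shape of the lm_studio provider spec (field names assumed)
LM_STUDIO_SPEC = {
    "aliases": ("lm-studio", "lmstudio", "lm_studio"),
    "default_base_url": "http://localhost:1234/v1",
    "backend": "openai_compat",
}


def looks_like_lm_studio(base_url: str) -> bool:
    """Auto-detection heuristic: LM Studio's local server defaults to port 1234."""
    return urlparse(base_url).port == 1234
```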

Documentation Updates

README.md:

  • Custom Provider: Updated guidance to recommend apiKey: null instead of dummy strings for local servers
  • vLLM: Updated example to use apiKey: null instead of "dummy"
  • LM Studio: Added comprehensive setup guide with:
    • Installation instructions
    • Server startup steps
    • Configuration examples
    • Usage notes
  • Provider table: Added LM Studio to the list of supported providers
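Putting the two documentation changes together, a config entry for a local LM Studio server might look like the fragment below. The exact key names and nesting are assumptions based on the `apiKey: null` convention mentioned above; consult the updated README for the authoritative shape.

```json
{
  "providers": {
    "lm_studio": {
      "apiKey": null,
      "baseUrl": "http://localhost:1234/v1"
    }
  }
}
```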

Test Plan

  • Verify apiKey: null works with existing local providers (ollama, vllm, custom)
  • Verify LM Studio provider auto-detection with http://localhost:1234/v1 endpoint
  • Verify explicit provider: "lm_studio" configuration works
  • Verify backward compatibility: existing configs with non-null API keys still work
  • Test with actual LM Studio instance (if available)
  • Verify documentation examples are accurate

Breaking Changes

None. This is backward compatible:

  • Existing configs with string API keys continue to work
  • The change is purely additive (nullable type extends the existing string type)

@sohamb117 sohamb117 marked this pull request as ready for review April 15, 2026 17:19
Collaborator

@Re-bin Re-bin left a comment


LGTM ;)

@Re-bin Re-bin merged commit 2b8e90d into HKUDS:main Apr 15, 2026
@sohamb117
Contributor Author

Thanks!
