# ============================================
# OpenSpace Environment Variables
# Copy this file to .env and fill in your keys
# ============================================

# ── LLM Credentials ──────────────────────────────────────
#
# OpenSpace resolves LLM credentials in this order (first match wins):
#
# 1. OPENSPACE_LLM_* — explicit override, always highest priority
# 2. Provider-native vars — OPENROUTER_API_KEY, OPENAI_API_KEY, etc.
# 3. ~/.nanobot/config.json or ~/.openclaw/openclaw.json — fallback (only when no explicit or provider key found)
#
# For most users, setting ONE of the provider-native keys below is enough.
# LiteLLM reads them automatically. See https://docs.litellm.ai/docs/providers
#
# Full configuration guide: openspace/config/README.md
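#
# Example of the precedence above: if both of the following were set,
# the OPENSPACE_LLM_* value would win and OPENROUTER_API_KEY would be
# ignored (key values shown are placeholders):
#
#   OPENSPACE_LLM_API_KEY=sk-override    # used (rule 1)
#   OPENROUTER_API_KEY=sk-or-v1-xxxx     # ignored (rule 2)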

# --- Option A: Provider-native key (simplest) ---
# Set the key that matches your model's provider:
# OpenRouter (for openrouter/* models, e.g. openrouter/anthropic/claude-sonnet-4.5)
OPENROUTER_API_KEY=
# Anthropic (for anthropic/claude-* models)
# ANTHROPIC_API_KEY=
# OpenAI (for openai/gpt-* models)
# OPENAI_API_KEY=
# DeepSeek (for deepseek/* models)
# DEEPSEEK_API_KEY=

# --- Option B: Explicit OpenSpace override (takes priority over Option A) ---
# Use these when you need full control, e.g. custom API base or non-standard provider.
# OPENSPACE_MODEL=openrouter/anthropic/claude-sonnet-4.5
# OPENSPACE_LLM_API_KEY=sk-xxx
# OPENSPACE_LLM_API_BASE=https://openrouter.ai/api/v1

# --- Option C: Local Ollama ---
# For ollama/* models, set OPENSPACE_MODEL and the local Ollama endpoint.
#
# OPENSPACE_MODEL=ollama/qwen3-coder:30b
# OLLAMA_API_BASE=http://127.0.0.1:11434
# OLLAMA_API_KEY=ollama
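#
# To check that the Ollama endpoint is reachable (assumes the default
# port above and that curl is installed), you can run:
#
#   curl http://127.0.0.1:11434/api/tags
#
# which lists the models available locally.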

# ── OpenSpace Cloud (optional) ──────────────────────────────
# Register at https://open-space.cloud to get your key.
# Enables cloud skill search & upload; local features work without it.
OPENSPACE_API_KEY=sk_xxxxxxxxxxxxxxxx

# ── GUI Backend (optional) ──────────────────────────────────
# Required only if using the GUI backend (Anthropic Computer Use).
# Uses the same ANTHROPIC_API_KEY above.
# Optional backup key for rate limit fallback:
# ANTHROPIC_API_KEY_BACKUP=

# ── Embedding (optional) ────────────────────────────────────
# Use a remote embedding API instead of the local model.
# If not set, OpenSpace uses a local embedding model (BAAI/bge-small-en-v1.5).
# EMBEDDING_BASE_URL=
# EMBEDDING_API_KEY=
# EMBEDDING_MODEL=openai/text-embedding-3-small

# ── E2B Sandbox (optional) ──────────────────────────────────
# Required only if sandbox mode is enabled in security config.
# E2B_API_KEY=

# ── Local Server (optional) ─────────────────────────────────
# Override the default local server URL (default: http://127.0.0.1:5000)
# Useful for remote VM integration (e.g., OSWorld).
# LOCAL_SERVER_URL=http://127.0.0.1:5000

# ── Debug (optional) ────────────────────────────────────────
# OPENSPACE_DEBUG=true