AI-native memory with dialectic reasoning for OpenClaw. Uses Honcho's peer paradigm to build and maintain separate models of the user and the agent — enabling context-aware conversations that improve over time. No local infrastructure required.
This plugin uses OpenClaw's slot system (`kind: "memory"`) to replace the built-in memory plugins (`memory-core`, `memory-lancedb`). During setup, existing memory files can be migrated to Honcho. Workspace docs (`SOUL.md`, `AGENTS.md`, `BOOTSTRAP.md`) can be updated manually to reference Honcho's tools instead of the old file-based system.
```
openclaw plugins install @honcho-ai/openclaw-honcho
openclaw honcho setup
openclaw gateway restart
```

`openclaw honcho setup` prompts for your Honcho API key, writes the config, and optionally uploads any legacy memory files to Honcho.
Alternative: ClawHub Skill
Use the `honcho-setup` skill to run migration interactively from within a chat session:
```
# 1. Install the skill
npx clawhub install honcho-setup

# 2. Restart OpenClaw to pick up the new skill

# 3. Install the plugin
openclaw plugins install @honcho-ai/openclaw-honcho

# 4. Restart the gateway
openclaw gateway restart

# 5. Open an agent session and invoke the skill
# The skill will prompt for your Honcho API key and run setup interactively
```

If you have existing workspace memory files (`USER.md`, `MEMORY.md`, `IDENTITY.md`, `memory/`, `canvas/`, etc.), `openclaw honcho setup` will detect them and offer to migrate them.
Migration is non-destructive — files are uploaded to Honcho. Originals are never deleted or moved.
User/owner files (content describes the user):
- `USER.md`, `MEMORY.md`
- All files in the `memory/` and `canvas/` directories (treated as user content)
Agent/self files (content describes the agent):
- `SOUL.md`, `IDENTITY.md`, `AGENTS.md`, `TOOLS.md`, `BOOTSTRAP.md`
Files are uploaded via `session.uploadFile()`. User/owner files go to the owner peer; agent/self files go to the agent peer (`agent-{agentId}`, e.g. `agent-main`).
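The routing rule can be sketched roughly as follows. This is an illustration of the behavior described above, not the plugin's actual source; the `"owner"` return label is an assumption, while the `agent-{agentId}` convention comes from the docs.

```typescript
// Hypothetical sketch of migration file routing (not the plugin's real code).
// Agent/self files go to the agent peer; everything else goes to the owner peer.
const AGENT_FILES = new Set([
  "SOUL.md", "IDENTITY.md", "AGENTS.md", "TOOLS.md", "BOOTSTRAP.md",
]);

function peerForFile(relativePath: string, agentId: string): string {
  // Anything under memory/ or canvas/ is treated as user content.
  if (relativePath.startsWith("memory/") || relativePath.startsWith("canvas/")) {
    return "owner";
  }
  const name = relativePath.split("/").pop() ?? relativePath;
  return AGENT_FILES.has(name) ? `agent-${agentId}` : "owner";
}

console.log(peerForFile("SOUL.md", "main"));        // "agent-main"
console.log(peerForFile("memory/2024-01.md", "main")); // "owner"
console.log(peerForFile("USER.md", "main"));        // "owner"
```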
The plugin ships template files in `node_modules/@honcho-ai/openclaw-honcho/workspace_md/`. Copy or merge these templates into your workspace for `AGENTS.md`, `SOUL.md`, and `BOOTSTRAP.md`. These templates reference the Honcho tools (`honcho_context`, `honcho_search_conclusions`, `honcho_ask`, `honcho_search_messages`, `honcho_session`) instead of the old file-based memory system.
Run `openclaw honcho setup` to configure interactively, or set values directly in `~/.openclaw/openclaw.json` under `plugins.entries["openclaw-honcho"].config`.
| Key | Type | Default | Description |
|---|---|---|---|
| `apiKey` | string | — | Honcho API key (required for managed; omit for self-hosted). |
| `workspaceId` | string | `"openclaw"` | Honcho workspace ID for memory isolation. |
| `baseUrl` | string | `"https://api.honcho.dev"` | API endpoint (for self-hosted instances). |
| `noisePatterns` | string[] | built-in defaults | Patterns for messages to skip. User-provided patterns are merged with the built-in defaults (unless `disableDefaultNoisePatterns` is set). |
| `disableDefaultNoisePatterns` | boolean | `false` | When `true`, built-in noise patterns are not applied; only `noisePatterns` entries are used. |
| `ownerObserveOthers` | boolean | `false` | Whether the owner peer observes agent messages in Honcho's social model. |
Run `openclaw honcho setup`, enter a blank API key, and set the Base URL to your instance (e.g., `http://localhost:8000`).
For setting up a local Honcho server, see the Honcho local development guide.
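If you prefer to edit the config file directly, a self-hosted setup might look like this. This is a sketch assembled from the config keys above: `apiKey` is omitted as described for self-hosted instances, and the `workspaceId` value is just an example.

```json
{
  "plugins": {
    "entries": {
      "openclaw-honcho": {
        "config": {
          "baseUrl": "http://localhost:8000",
          "workspaceId": "openclaw"
        }
      }
    }
  }
}
```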
The plugin automatically drops messages that match noise patterns before saving to Honcho. Built-in defaults filter:
- `HEARTBEAT_OK` — assistant heartbeat acknowledgments
- `A scheduled reminder has been triggered` — cron reminder boilerplate
- `Execute your Session Startup sequence now` — session startup commands
- `Queued messages from` — queued message wrapper headers
Add custom patterns via `noisePatterns` in your config:
```
{
  "plugins": {
    "entries": {
      "openclaw-honcho": {
        "config": {
          "noisePatterns": ["my custom noise string"]
        }
      }
    }
  }
}
```

Custom patterns are merged with the built-in defaults. Each pattern matches if the message equals it or starts with it. Patterns starting with `/` are treated as an anchored regex (e.g., `/^HEARTBEAT/i`).
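The matching rules just described can be sketched like this. The `matchesNoise` helper is hypothetical, written only to illustrate the stated semantics (exact match, prefix match, or `/body/flags` regex form); the plugin's real implementation may differ.

```typescript
// Illustrative sketch of the noise-pattern rules (hypothetical helper).
function matchesNoise(message: string, pattern: string): boolean {
  // Patterns written as /body/flags are treated as a regex.
  const regexForm = pattern.match(/^\/(.*)\/([a-z]*)$/);
  if (regexForm) {
    return new RegExp(regexForm[1], regexForm[2]).test(message);
  }
  // Otherwise: the message equals the pattern or starts with it.
  return message === pattern || message.startsWith(pattern);
}

console.log(matchesNoise("HEARTBEAT_OK", "/^HEARTBEAT/i"));                   // true
console.log(matchesNoise("Queued messages from #general", "Queued messages from")); // true
console.log(matchesNoise("hello there", "HEARTBEAT_OK"));                     // false
```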
Honcho's `observeOthers` setting controls whether a peer forms representations of other peers based on messages it witnessed in shared sessions. The agent peer always has `observeOthers: true` — it sees and reasons about the user's messages. The owner (user) peer defaults to `observeOthers: false` — it is modeled only from what the user said, not from what the agent replied.
Set `ownerObserveOthers: true` to let the owner peer also observe agent messages. This gives Honcho perspective-aware memory: the owner peer stores conclusions about the agent based on what it witnessed, so the user's representation reflects the full conversational context rather than just their own side of it.
Once installed, the plugin works automatically:
- Message Observation — After every AI turn, the conversation is persisted to Honcho. Both user and agent messages are observed, allowing Honcho to build and refine its models. Message capture starts when the plugin is active for a session, and original timestamps are preserved for captured messages. Messages are also flushed before session compaction and `/new`/`/reset`, so no conversation data is lost.
- Tool-Based Context Access — The AI can query Honcho mid-conversation using tools like `honcho_context`, `honcho_search_conclusions`, and `honcho_ask` to retrieve relevant context about the user. Context is injected during OpenClaw's `before_prompt_build` phase, ensuring accurate turn boundaries.
- Dual Peer Model — Honcho maintains separate representations: one for the user (preferences, facts, communication style) and one for the agent (personality, learned behaviors). Each OpenClaw agent gets its own Honcho peer (`agent-{id}`), so multi-agent workspaces maintain isolated memory.
- Clean Persistence — Platform metadata (conversation info, sender headers, thread context, forwarded messages) is stripped before saving to Honcho, ensuring only meaningful content is persisted. Noise messages (heartbeat acks, cron boilerplate, startup commands) are dropped entirely via configurable pattern filters.
Honcho handles all reasoning and synthesis in the cloud.
OpenClaw uses a multi-agent architecture where a primary agent can spawn subagents to handle specialized tasks. The Honcho plugin is fully aware of this hierarchy:
- Automatic Subagent Detection — When OpenClaw spawns a subagent, the plugin tracks the parent→child relationship via the `subagent_spawned` hook. Each subagent session records its `parentPeerId` in metadata.
- Parent Observer Peer — The spawning agent is added as a silent observer in the subagent's Honcho session (`observeMe: false, observeOthers: true`). This gives Honcho visibility into the full agent tree: the parent can see what its subagents are doing without its own messages being attributed to the subagent session.
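Conceptually, the observer flags in a subagent's session might be arranged as below. This is a hypothetical data shape for illustration only — the peer IDs (`agent-main`, `agent-researcher`) are made-up examples, and the real Honcho SDK API may structure this differently.

```typescript
// Hypothetical sketch of the observer flags described above (not the real SDK shape).
interface PeerConfig {
  observeMe: boolean;     // may this peer's messages shape others' models of it?
  observeOthers: boolean; // does this peer form conclusions about others' messages?
}

const subagentSessionPeers: Record<string, PeerConfig> = {
  // The subagent itself participates normally.
  "agent-researcher": { observeMe: true, observeOthers: true },
  // The parent is a silent observer: it watches but is not attributed here.
  "agent-main": { observeMe: false, observeOthers: true },
};

console.log(subagentSessionPeers["agent-main"].observeMe); // false
```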
The plugin manages markdown files in your workspace:
| File | Contents |
|---|---|
| `SOUL.md` | Agent profile — OpenClaw's self-model and personality. |
| `IDENTITY.md` | Static agent identity. Uploaded to the agent peer in Honcho during setup; the local file is not modified. |
| `AGENTS.md` | Agent capabilities and tool descriptions. |
| `TOOLS.md` | Tool definitions and usage instructions for the agent. |
| `BOOTSTRAP.md` | Initial context and instructions for the agent. |
Migration: Legacy files (`USER.md`, `MEMORY.md`, the `memory/` directory) are uploaded to Honcho during `openclaw honcho setup`. Originals are preserved in place.
The plugin provides five tools — four for data retrieval (cheap, no LLM calls) and one interactive, LLM-powered Q&A tool.
| Tool | Type | Description |
|---|---|---|
| `honcho_context` | Data | User knowledge across all sessions. `detail='card'` for key facts, `'full'` for a broad representation. |
| `honcho_search_conclusions` | Data | Semantic vector search over stored conclusions. Returns raw memories ranked by relevance. |
| `honcho_search_messages` | Data | Find specific messages across all sessions. Filter by sender (user/agent/all), date, or metadata. |
| `honcho_session` | Data | Current session history and summary. Supports semantic search within the session. |
| `honcho_ask` | Q&A | Ask Honcho a question about the user. `depth='quick'` for facts, `'thorough'` for synthesis. |
```
openclaw honcho setup                         # Configure API key and migrate legacy files
openclaw honcho status                        # Show current installation and setup state
openclaw honcho ask <question>                # Query Honcho about the user
openclaw honcho search <query> [-k N] [-d D]  # Semantic search over memory (topK, maxDistance)
```

This plugin automatically exposes OpenClaw's `memory_search` and `memory_get` tools when a memory backend is configured. This allows you to use both Honcho's cloud-based memory and local file search together.
1. Install QMD on your server (see the QMD documentation).

2. Configure OpenClaw to use QMD as the memory backend in `~/.openclaw/openclaw.json`:

   ```
   {
     "memory": {
       "backend": "qmd",
       "qmd": {
         "limits": {
           "timeoutMs": 120000
         }
       }
     }
   }
   ```

3. Set up QMD collections for your files:

   ```
   qmd collection add ~/Documents/notes --name notes
   qmd update
   ```

4. Restart OpenClaw:

   ```
   openclaw gateway restart
   ```

When QMD is configured, you get both Honcho and local file tools:
| Tool | Source | Description |
|---|---|---|
| `honcho_*` | Honcho | Cross-session memory, user modeling, dialectic reasoning |
| `memory_search` | QMD | Search local markdown files |
| `memory_get` | QMD | Retrieve file content |
OpenClaw runs as a systemd service with a different `PATH`. Create a symlink:

```
sudo ln -s ~/.bun/bin/qmd /usr/local/bin/qmd
```

QMD operations can take a while, especially first-time queries that download ~2 GB of models. Increase the timeout in `~/.openclaw/openclaw.json`:
```
{
  "memory": {
    "qmd": {
      "limits": {
        "timeoutMs": 120000
      }
    }
  }
}
```

The default timeout is 4000 ms, which may be too short depending on your hardware and can cause errors. Setting it to 120000 ms (2 minutes) gives QMD enough time. You can verify it's working in the logs:
```
19:09:02 tool start: memory_search
19:09:14 tool end: memory_search   # 12 seconds — within the 120s limit
```
You can also pre-warm QMD to avoid first-run delays:
```
qmd query "test"
```

See `CONTRIBUTING.md` for development setup, building from source, and contribution guidelines.
- GitHub Issues: Open an Issue
- Discord: Join the Community
- X (Twitter): Follow @honchodotdev
- Blog: Read about Honcho and Agents
