feat(mcp): improve neighbourhood publishing with auto-cloning#730

Merged
lucksus merged 8 commits into dev from fix/mcp-neighbourhood-improvements
Mar 7, 2026

Conversation

data-bot-coasys (Contributor) commented Mar 7, 2026

Problem

AI agents (including me when building the SoA prototype today) bypass MCP and fall back to GraphQL because the neighbourhood publishing flow requires too much low-level knowledge:

  • Manually clone link languages
  • Understand language addresses and templates
  • Know the exact sequence: clone → publish → use address

This defeats the purpose of MCP — it should be the only interface agents need.

Solution

Three improvements:

1. list_link_language_templates — new tool

Returns available P2P sync engine templates (currently just Holochain perspective-diff-sync). Agents pick from this list instead of manually looking up addresses in config or GraphQL.

{
  "templates": ["QmzSYwdn..."],
  "count": 1,
  "hint": "Pass one of these as link_language_template when publishing"
}
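The response shape above can be consumed with a thin wrapper. This is a hypothetical sketch of how an agent might call the new tool and pick a template; `callTool` stands in for a real MCP client invocation, and the stubbed return value mirrors the JSON shown above (names and signatures here are illustrative assumptions, not the actual implementation).

```typescript
// Shape of the list_link_language_templates response, as shown above.
type TemplatesResponse = {
  templates: string[];
  count: number;
  hint: string;
};

// Stand-in for a real MCP tools/call round-trip; returns the
// documented response shape with a placeholder template address.
async function callTool(name: string): Promise<TemplatesResponse> {
  return {
    templates: ["<template-address>"],
    count: 1,
    hint: "Pass one of these as link_language_template when publishing",
  };
}

// Pick a template from the list instead of hard-coding an address.
async function pickTemplate(): Promise<string> {
  const res = await callTool("list_link_language_templates");
  if (res.count === 0) {
    throw new Error("No link-language templates available");
  }
  // With a single Holochain perspective-diff-sync template,
  // the first entry is the one to use.
  return res.templates[0];
}
```

The point of the wrapper is that the agent never sees a raw language address in its prompt or code; it only forwards whatever the tool returned.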

2. Auto-clone in neighbourhood_publish_from_perspective

Agents now pass:

  • perspective_uuid — what to share
  • link_language_template — which sync engine (from list_link_language_templates)
  • name — human-readable neighbourhood name

The tool handles:

  • Cloning the template with unique ID
  • Publishing the clone via the language language (AD4M's meta-language for publishing other languages)
  • Using the cloned instance for neighbourhood sync

No manual language management. One-step workflow.
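The three steps the tool now handles can be sketched end to end. This is a minimal illustration with stubbed operations, not the real implementation: `cloneTemplate` and `publishNeighbourhood` are hypothetical stand-ins for the actual AD4M clone/publish calls, and all names and signatures are assumptions.

```typescript
import { randomUUID } from "crypto";

interface PublishResult {
  linkLanguageAddress: string;
  neighbourhoodUrl: string;
}

// Stub: clone a link-language template under a unique ID and
// return the address of the cloned instance.
function cloneTemplate(templateAddress: string): string {
  const uniqueId = randomUUID();
  return `${templateAddress}-clone-${uniqueId}`;
}

// Stub: create the neighbourhood on top of the cloned instance.
function publishNeighbourhood(
  perspectiveUuid: string,
  linkLanguageAddress: string,
  name: string
): PublishResult {
  return {
    linkLanguageAddress,
    neighbourhoodUrl: `neighbourhood://${linkLanguageAddress}`,
  };
}

// One-step workflow: the agent supplies only the three inputs,
// and cloning/publishing happens internally.
function neighbourhoodPublishFromPerspective(
  perspectiveUuid: string,
  linkLanguageTemplate: string,
  name: string
): PublishResult {
  const cloned = cloneTemplate(linkLanguageTemplate);
  return publishNeighbourhood(perspectiveUuid, cloned, name);
}
```

The unique ID per clone is what keeps each neighbourhood on its own link-language instance rather than sharing the template's.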

3. Strip implementation details from descriptions

Tool descriptions now explain what ("publish a perspective as a neighbourhood") not how ("requires a perspective-diff-sync language address QmzSYwdn...").

Agents work at a higher level of abstraction.

Before/After

Before:

// Agent needs to know language cloning workflow
template = "QmzSYwdn..." // where did this come from?
cloned = language_apply_template_and_publish(template, '{"name": "My NH"}')
neighbourhood_publish_from_perspective(perspective_id, cloned.address)

After:

templates = list_link_language_templates()
neighbourhood_publish_from_perspective(
  perspective_id, 
  templates.templates[0], 
  "My Neighbourhood"
)

Testing

Verified with:

  • cargo check — compiles clean
  • Manual GraphQL/MCP comparison — same capabilities, simpler interface

Next: test with actual neighbourhood creation (will do with proper SoA memory perspective).

Related

Part of making MCP the canonical AI-agent interface. Closes the loop on why I fell back to GraphQL when building the SoA prototype — this would have prevented that.

Summary by CodeRabbit

  • New Features

    • Can list available link-language templates for neighbourhood creation.
    • Added a language metadata lookup tool to inspect installed languages.
  • Improvements

    • Neighbourhood publish now auto-clones templates to create unique link-language instances.
    • Publish responses now include the cloned link-language address and neighbourhood name; publish/join flows use the cloned instance.
  • Documentation

    • Expanded docs for language & neighbourhood tools, publishing and joining workflows.
