
scriptorium-template

🇬🇧 English · 🇳🇱 Nederlands

The structured writing system for people who own every claim they publish.

A GitHub template that turns any capable LLM (Claude Code, Codex, Gemini, Cursor, Aider, or anything with file access) into a phase-gated writing coach, not a text generator, for reports, strategies, scientific articles, and opinion pieces.

Karpathy's pattern compounds what you read. Scriptorium compounds what you publish.

Structured writing with scriptorium

The problem this solves

AI writing tools have created a new workplace crisis called workslop: content that looks polished, reads fluently, and collapses under a single follow-up question. Stanford and BetterUp researchers found that 40% of workers receive it monthly. It erodes trust, shifts the burden to readers, and makes "the LLM wrote it" sound like an excuse rather than a workflow.

Scriptorium does the opposite. The LLM asks questions, surfaces gaps, marks assumptions, and refuses to advance until the current phase is actually finished. You keep authorship. The coach enforces method.


Who this is for

Policy analysts and government professionals — writing reports and strategy documents where every claim must survive an audit. Source IDs trace each sentence to a file. Assumption IDs mark where evidence runs out. Private-fork-by-default and paraphrase-over-verbatim handle sensitive material.

Researchers and PhD candidates — writing articles, dissertations, and grant proposals where reviewers ask "how do you know this?". The answer is a filename. SemVer draft versioning means you always know which version went to which reviewer.

Consultants and advisers — producing deliverables for clients who will push back. Triple review (substantive, methodological, linguistic) with blocking/advisable/noted findings makes the review conversation explicit rather than implicit.


Why you want this

  • You keep authorship; the LLM enforces method. The coach asks questions, surfaces gaps, marks assumptions, and refuses to advance until the current phase is actually finished. No more 1,200 words of fluent nothing.
  • Every claim is traceable. Each sentence cites a source ID (S03) or an assumption ID (A02). Nothing floats. When a reviewer asks "how do you know that?", the answer is a filename.
  • Drafts have real version discipline. SemVer for prose: a new source bumps PATCH, a structural rewrite bumps MINOR, a change of core message bumps MAJOR and returns you to Phase 1. Every version is immutable.
  • Triple review, not just proofreading. Substantive (does the argument work?), methodological (does it obey genre conventions?), linguistic (does it read well?). Findings are marked blocking, advisable, or noted.
  • Message delivery, not just message drafting. The coach applies modern communication techniques: pyramidal structure (Minto), storytelling arcs, and audience-first framing. The finished piece actually lands with the reader rather than just existing on the page.
  • Knowledge compounds. Durable findings promote from a finished piece into persistent-knowledge/ and feed the next deliverable's Phase 0.
  • Confidential material stays confidential. Private-fork-by-default, paraphrase-over-verbatim for non-public sources. Read PRIVACY.md before your first deliverable with sensitive material.
  • Runtime language follows you; committed files stay English. Work in Dutch, German, French, or Japanese; ship an English-readable repository.
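To make the version discipline concrete, here is a hypothetical sketch of how versioned drafts might accumulate as files; the filenames and the drafts/ layout are assumptions for illustration, since the coach handles the actual bookkeeping:

```shell
# Hypothetical illustration of the SemVer draft rules; filenames are assumed.
mkdir -p drafts
printf '# draft\n' > drafts/draft-1.2.0.md          # current draft
cp drafts/draft-1.2.0.md drafts/draft-1.2.1.md      # new source         -> PATCH
cp drafts/draft-1.2.1.md drafts/draft-1.3.0.md      # structural rewrite -> MINOR
cp drafts/draft-1.3.0.md drafts/draft-2.0.0.md      # new core message   -> MAJOR (back to Phase 1)
chmod a-w drafts/draft-1.2.0.md drafts/draft-1.2.1.md drafts/draft-1.3.0.md  # superseded versions stay immutable
```

Each superseded version is left write-protected, matching the rule that every version is immutable.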

How this relates to Karpathy's LLM wiki pattern

In April 2026, Andrej Karpathy published a GitHub Gist describing a three-folder markdown setup where an LLM builds and maintains a persistent, compounding knowledge base. The post reached 16 million views in days.

Scriptorium is the next level on top of Karpathy's idea. The wiki layer is still here — still compounding knowledge across sessions — but it now feeds something: the deliverable, a finished written artefact produced under a six-phase coach protocol with source IDs, assumption IDs, SemVer drafts, and triple review. Phase 5 promotions feed back into the wiki so the next piece starts smarter than the last.

Use this template when you need to produce something. If you only want the wiki, the upstream bashiraziz/llm-wiki-template is the right tool.


How easy is it to start

Press Use this template on GitHub, clone the result, open your tool, and say: "set me up".

The onboarding flow detects your language, asks whether the repository is public or private, walks you through four questions about your first deliverable (genre, title, audience, constraints), and creates the folder. Then it hands off to Phase 0. No config files to edit by hand.


The six phases

For every deliverable, the coach walks you through six phases with hard-stop confirmation gates:

  1. Context — sources entered with stable IDs; assumptions made explicit.
  2. Core message — one sentence the reader should retain.
  3. Structure — pyramidal outline, each claim backed by a source or declared assumption.
  4. Drafting — versioned drafts (SemVer), new source defaults to PATCH.
  5. Review — substantive, methodological, linguistic passes with blocking / advisable / noted findings.
  6. Meta-evaluation — which assumption collapses the argument if false; which findings promote to persistent knowledge.

Every phase ends at an explicit gate. The coach will not advance without your confirmation.
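For illustration, the Phase 1 artefacts might be recorded in a deliverable's CONTEXT.md along these lines (a hypothetical sketch with assumed filenames; the template's real CONTEXT.md layout may differ):

```markdown
## Sources
- S01: raw/workslop-survey.pdf        (Stanford/BetterUp study)
- S02: raw/karpathy-llm-wiki-gist.md  (three-folder wiki pattern)

## Assumptions
- A01: Readers have not seen the underlying survey themselves.
```

A draft sentence then cites an ID inline, for example `Forty percent of workers receive workslop monthly [S01].`, so every claim traces back to a file or a declared assumption.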


Two layers, one repo

```
┌─────────────────────────────────────────────────────────┐
│  deliverables/          primary. one folder per piece.  │
│    CONTEXT.md, sources/, drafts/, reviews/, final.md    │
└──────────────────────────┬──────────────────────────────┘
                           │  promotions at Phase 5
                           ▼
┌─────────────────────────────────────────────────────────┐
│  persistent-knowledge/  coach-curated, cross-deliverable│
│  wiki/                  free-form knowledge accumulation│
│  raw/                   immutable source documents      │
└─────────────────────────────────────────────────────────┘
```

The writing layer (deliverables/) is primary. The knowledge layer (persistent-knowledge/, wiki/, raw/) feeds sources into Phase 0 and absorbs durable findings after Phase 5. The two layers coexist; neither overwrites the other.

Publication language vs runtime language

All files committed to this repository are written in English. The coach renders its interactive output in the author's working language, detected from the first message or declared in deliverables/<project>/CONTEXT.md. Dutch, German, French, or any other language works at runtime; only what lives in git stays English.

Supported genres

| Genre | File |
| --- | --- |
| Report | `genres/report/GENRE.md` |
| Strategy | `genres/strategy/GENRE.md` |
| Scientific article (APA 7) | `genres/scientific-article/GENRE.md` |
| Opinion piece | `genres/opinion-piece/GENRE.md` |

Each file declares audience, structure, style, citation convention, phase mapping, and review matrix. Adding a new genre means adding one GENRE.md.

Supported tools

One canonical protocol at protocol/WRITING-COACH.md; thin adapters per tool.

| Tool | Adapter | Loading |
| --- | --- | --- |
| Claude Code | `adapters/claude-code/CLAUDE.md` + `.claude/skills/writing-coach/SKILL.md` | Skill loads on writing triggers |
| OpenAI Codex CLI | root `AGENTS.md` | Auto-loaded |
| Gemini CLI | `adapters/gemini/GEMINI.md` | Copy/symlink to root |
| Cursor | `adapters/cursor/.cursor/rules/writing-coach.mdc` | Glob-scoped to `deliverables/` |
| Aider, generic | root `AGENTS.md` or paste-in system prompt | See `adapters/generic/README.md` |

For any LLM with file-reading capability, the generic adapter explains how to paste protocol/WRITING-COACH.md as a system prompt.
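As a concrete example of adapter installation, the "copy/symlink to root" step in the Gemini CLI row above might look like this (a sketch; the path comes from the table):

```shell
# Symlink the Gemini adapter into the repository root
# (use cp instead of ln -sf to copy rather than link).
ln -sf adapters/gemini/GEMINI.md GEMINI.md
```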

Quickstart

```sh
# 1. Use this template on GitHub (private visibility by default),
#    clone your new repo.
git clone git@github.com:YOU/your-scriptorium.git
cd your-scriptorium

# 2. Install the adapter for your tool (example: Claude Code)
cp adapters/claude-code/CLAUDE.md .
cp -r adapters/claude-code/.claude .

# 3. Open your tool in the repo and say: "set me up".
#    The onboarding flow handles everything: language detection,
#    visibility question, four-question wizard, first deliverable,
#    and handoff to Phase 0.
```

Prefer the manual path? Copy deliverables/example-opinion-piece/CONTEXT.md to deliverables/my-first-piece/CONTEXT.md, edit it, then tell the coach to begin Phase 0.

A worked end-to-end example lives at deliverables/example-opinion-piece/.

How this compares to other wiki templates

| Template | Primary purpose | Choose it when |
| --- | --- | --- |
| karpathy/ gist | Reference pattern | You want the original description and will build your own. |
| bashiraziz/llm-wiki-template | Polished implementation of the wiki pattern | Your goal is knowledge accumulation across open-ended reading. |
| nvk/llm-wiki | Lightweight personal wiki variant | You want a minimal, opinionated take without session-export machinery. |
| lucasastorian/llmwiki | Code-assistant-flavoured wiki | Your domain is software engineering notes. |
| nashsu/llm_wiki | Research-notebook variant | Your workflow centres on paper reading and summarisation. |
| scriptorium-template | Phase-gated writing coach with wiki as support layer | You need to produce finished reports, strategies, scientific articles, or opinion pieces, with source traceability and review discipline. |

Repository layout

```
scriptorium-template/
├── protocol/                   canonical source of truth (WRITING-COACH.md)
├── genres/                     genre conventions (report, strategy, …)
├── deliverables/               your writing projects (one per piece)
│   └── example-opinion-piece/  worked end-to-end example
├── persistent-knowledge/       coach-curated cross-deliverable store
├── wiki/                       free-form knowledge base (Karpathy pattern)
├── raw/                        immutable source documents
├── sessions/                   auto-exported session transcripts (gitignored)
├── adapters/                   tool-specific adapters
│   ├── claude-code/            skill + CLAUDE.md
│   ├── codex/                  pointer to root AGENTS.md
│   ├── gemini/                 GEMINI.md
│   ├── cursor/                 Cursor rules
│   └── generic/                any other LLM
├── scripts/                    session export, indexing, project wiring
├── docs/                       Obsidian, Windows, cross-project wiring
├── examples/                   domain examples for the wiki layer
├── AGENTS.md                   root-level adapter for Codex, Aider, Cursor fallback
├── CLAUDE.md                   Claude Code entry point (copied from adapter)
├── PRIVACY.md                  source handling and public-fork rules
├── CONTRIBUTING.md             fork-as-template and upstream PR workflow
├── CHANGELOG.md
├── LICENSE                     CC BY 4.0 for prose, MIT for code
├── README.md                   this file
└── SETUP-GUIDE.md              full setup (wiki layer, hooks, multi-device)
```

Confidentiality

If you are writing with government, client, or NDA material, read PRIVACY.md before your first deliverable. The default for a freshly cloned template is private visibility. The coach asks about repository visibility in Phase 0 and prefers paraphrase over verbatim for non-public sources.

Licence

Dual-licensed. CC BY 4.0 for prose, templates, and documentation. MIT for code (scripts, hooks, configuration). Your own deliverables are yours; the licence only applies to template content. See LICENSE.

Contributing

See CONTRIBUTING.md. New tool adapters and new genres are especially welcome.

Acknowledgments

Pattern by Andrej Karpathy. Wiki-layer implementation derived from bashiraziz/llm-wiki-template. Writing-coach protocol and genre conventions original to this template.


scriptorium-template · a writing coach, not a text generator.
