A headless coding agent CLI designed for one-shot prompts or loops, built on the Vercel AI SDK and AI Gateway.
- No interactive CLI
- A minimal set of tools
- The `just-bash` library exposes only the current working directory
Batteries not included. Designed to be learned and forked to suit your workflow.
- Clone the repository:

  ```bash
  git clone https://github.com/AAorris/coda.git ~/my-coda  # or wherever you prefer
  cd ~/my-coda
  ```

- Install dependencies:

  ```bash
  pnpm install
  ```

- Set up your API key: Create an AI Gateway API key and write it to `.env.local` as `AI_GATEWAY_API_KEY=...`

- Create the binary (optional): `./install.sh` (enables you to use `coda` on the CLI)
For now, the installed command runs your local checkout via `npx tsx`. You are encouraged to fork and develop the codebase into your own agent.

To update your local copy:

```bash
git pull
```
Run coda with a prompt:
coda "Create a hello.txt file with 'Hello, World!'"Configure your API key interactively:
coda setupcoda "prompt"- Run agent with a promptcoda "@file.txt"- Run agent with a prompt from a filecoda -m "model-name" -p "prompt"- Use specific modelcoda --help- Show helpcoda --version- Show version
The agent has access to:
- Bash — Run shell commands in a sandboxed `/workspace` environment
- Edit — Surgical file edits via exact string replacement
- ReadFile / WriteFile — File I/O operations
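
For illustration, here is a rough sketch of how one of these tools might be declared with the AI SDK's `tool()` helper. The name, schema, and sandbox path handling are assumptions rather than coda's actual definitions, and the `parameters`/`execute` shape follows the AI SDK 4.x API:

```ts
import { tool } from "ai";
import { z } from "zod";
import { readFile } from "node:fs/promises";
import { join } from "node:path";

// Hypothetical ReadFile tool; the real coda definitions may differ.
export const readFileTool = tool({
  description: "Read a file from the sandboxed /workspace directory",
  parameters: z.object({
    path: z.string().describe("Path relative to /workspace"),
  }),
  execute: async ({ path }) => {
    // Resolve against the sandbox root so reads stay inside /workspace.
    return readFile(join("/workspace", path), "utf8");
  },
});
```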
Coda runs your prompt through an AI model with file system tools. It operates in a sandboxed environment mapping `/workspace` to your current directory, keeping operations safe and contained.
The edit tool enforces exact string matching — no fuzzy replacements. This ensures precision when modifying code.
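
A minimal sketch of what exact-string replacement can look like (a hypothetical helper for illustration, not coda's implementation):

```ts
import { readFile, writeFile } from "node:fs/promises";

// Hypothetical edit helper: replaces exactly one occurrence or fails loudly.
async function editFile(path: string, oldText: string, newText: string): Promise<void> {
  const contents = await readFile(path, "utf8");
  const occurrences = contents.split(oldText).length - 1;
  if (occurrences === 0) throw new Error(`Edit failed: old text not found in ${path}`);
  if (occurrences > 1) throw new Error(`Edit failed: old text is not unique in ${path}`);
  await writeFile(path, contents.replace(oldText, newText), "utf8");
}
```

Requiring a unique, exact match forces the model to quote surrounding context, which is what keeps edits surgical.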
The codebase follows Unix philosophy: agent produces events → handlers consume events.
```
┌─────────────┐     events      ┌──────────────────┐
│    Agent    │ ───────────────▶│     EventBus     │
│ (producer)  │                 │  (broadcaster)   │
└─────────────┘                 └────────┬─────────┘
                                         │
                     ┌───────────────────┼───────────────────┐
                     ▼                   ▼                   ▼
              ┌─────────────┐     ┌─────────────┐     ┌─────────────┐
              │   Console   │     │    File     │     │   Custom    │
              │   Handler   │     │   Handler   │     │   Handler   │
              └─────────────┘     └─────────────┘     └─────────────┘
```
- Agent: Runs the AI model, emits events for each step (tool calls, text output, completion)
- EventBus: Broadcasts events to all registered handlers using `Promise.allSettled` for isolation
- Handlers: Process events independently (console output, file logging, etc.)
This design keeps concerns separated—the agent doesn't know about output formatting, and handlers don't know about AI internals.
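
A rough sketch of that pattern (hypothetical types and names, not coda's actual code):

```ts
type AgentEvent =
  | { type: "tool-call"; tool: string; args: unknown }
  | { type: "text"; text: string }
  | { type: "done" };

type Handler = (event: AgentEvent) => void | Promise<void>;

// Hypothetical EventBus: Promise.allSettled isolates handlers, so one
// failing handler never blocks or crashes the others.
class EventBus {
  private handlers: Handler[] = [];

  register(handler: Handler): void {
    this.handlers.push(handler);
  }

  async emit(event: AgentEvent): Promise<void> {
    await Promise.allSettled(this.handlers.map((h) => Promise.resolve(h(event))));
  }
}

// Usage: the agent only calls bus.emit(); handlers decide how to render.
const bus = new EventBus();
bus.register((e) => console.log(JSON.stringify(e)));
void bus.emit({ type: "text", text: "Hello" });
```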
Coda looks for configuration in this order:
- Environment variables (`AI_GATEWAY_API_KEY`)
- `~/.coda/.env.local` file
- Local `.env` files (for development)
The config file uses standard `.env` format:

```
AI_GATEWAY_API_KEY=vck_...
```
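
To illustrate how that precedence could be implemented, here is a hypothetical loader using `dotenv`, which never overwrites variables that are already set:

```ts
import { config } from "dotenv";
import { homedir } from "node:os";
import { join } from "node:path";

// Hypothetical loader: because dotenv skips variables that already exist,
// earlier sources win: real env vars beat ~/.coda/.env.local, which beats
// local .env files.
export function loadApiKey(): string | undefined {
  config({ path: join(homedir(), ".coda", ".env.local") });
  config({ path: ".env.local" });
  config({ path: ".env" });
  return process.env.AI_GATEWAY_API_KEY;
}
```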
ISC