An offline AI coding assistant for Raspberry Pi 5. Works without internet - perfect for airplanes.
local-code is a wrapper around aider, which lets you chat with an AI that can:
- Read your code files
- Write and edit files for you
- Suggest shell commands to run
- Auto-commit changes to git
It runs entirely on your Pi using Ollama - no cloud, no API keys, no internet needed.
```
cd ~/your-project
local-code
```

Then just tell it what you want:
- "add a login page"
- "fix the bug in auth.py"
- "write tests for the utils module"
```
┌─────────────┐     ┌─────────────┐     ┌─────────────┐
│ local-code  │────▶│    aider    │────▶│   ollama    │
│  (script)   │     │ (AI coding) │     │ (runs LLM)  │
└─────────────┘     └─────────────┘     └─────────────┘
```
- Ollama runs the AI model locally on your Pi's CPU
- Aider talks to Ollama and handles file reading/writing
- local-code is a simple script that launches aider with the right settings
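The wrapper itself can be very small. Here is a hypothetical sketch of what a script like `lc` might do - the real `lc` may differ. The `OLLAMA_API_BASE` variable and the `ollama_chat/` model prefix come from aider's Ollama support; the positional model argument is an assumption for illustration:

```shell
#!/usr/bin/env bash
# Hypothetical sketch of an lc-style wrapper; not the actual script.
set -euo pipefail

# First argument (if any) picks the model; default matches this README.
MODEL="${1:-qwen2.5-coder:1.5b}"

# Point aider at the local Ollama server (default port 11434).
export OLLAMA_API_BASE="${OLLAMA_API_BASE:-http://127.0.0.1:11434}"

echo "launching aider with ollama_chat/$MODEL"
if command -v aider >/dev/null 2>&1; then
  exec aider --model "ollama_chat/$MODEL"
fi
```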
```
local-code                        # Start coding assistant
local-code -m qwen2.5-coder:3b    # Use a bigger/smarter model (slower)
local-code --status               # Check if everything is running
local-code --list                 # Show available models
local-code --help                 # Show all options
```

Once running, you can:

- `/add file.py` - add a file to the conversation
- `/drop file.py` - remove a file
- `/run npm test` - run a command
- `/diff` - see pending changes
- `/help` - all commands
- `/exit` or Ctrl+C - quit
| Model | Size | Speed | Quality |
|---|---|---|---|
| qwen2.5-coder:1.5b | 986 MB | Fast | Good (default) |
| qwen2.5-coder:3b | 1.9 GB | Slower | Better |
| codegemma:2b | 1.6 GB | Medium | Good |
Add more models with:
```
ollama pull <model-name>
```

```
~/dev/local-code/
├── lc          # The main script
└── README.md   # This file

~/.local/bin/
└── local-code -> ~/dev/local-code/lc   # Symlink so you can run it anywhere
```
"Ollama not running"
```
sudo systemctl start ollama
```

Slow responses
- First response is always slow (model loading)
- Use the 1.5b model (default) for speed
- Responses stream as they generate
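One way to hide the model-loading delay is to warm the model up before you start a session. This is a sketch using Ollama's `/api/generate` endpoint, whose `keep_alive` field holds the model in memory; the 30-minute value is an arbitrary choice, and the default port 11434 is assumed:

```shell
# Preload the model so the first chat response skips the load step.
PAYLOAD='{"model": "qwen2.5-coder:1.5b", "keep_alive": "30m"}'
curl -s http://localhost:11434/api/generate -d "$PAYLOAD" \
  || echo "ollama is not reachable"
```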
Model not found
```
ollama pull qwen2.5-coder:1.5b
```

Install Ollama and pull the default model:

```
curl -fsSL https://ollama.com/install.sh | sh
ollama pull qwen2.5-coder:1.5b
```

Install uv, then use it to install aider:

```
curl -LsSf https://astral.sh/uv/install.sh | sh
~/.local/bin/uv tool install --python 3.12 aider-chat
```

Create the symlink:

```
mkdir -p ~/.local/bin
ln -sf ~/dev/local-code/lc ~/.local/bin/local-code
```

Add this to your ~/.bashrc or ~/.zshrc:

```
export PATH="$HOME/.local/bin:$PATH"
```

Then restart your terminal or run `source ~/.bashrc`. Verify the setup with:

```
local-code --status
```

- Raspberry Pi 5 (8GB recommended) or similar ARM64 Linux
- Ollama
- Aider (via uv)
- ~1-2 GB disk space per model
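You can approximate the status check (`local-code --status`) by hand. This is a hypothetical sketch of the kind of checks such a command might run; the real script's output may differ:

```shell
# Rough health check: is the Ollama service up, is aider on PATH,
# and is the default model pulled?
OLLAMA_STATE=$( (systemctl is-active --quiet ollama 2>/dev/null && echo running) || echo stopped )
AIDER_STATE=$( (command -v aider >/dev/null 2>&1 && echo installed) || echo missing )
MODEL_STATE=$( (ollama list 2>/dev/null | grep -q 'qwen2.5-coder' && echo pulled) || echo missing )

echo "ollama: $OLLAMA_STATE"
echo "aider:  $AIDER_STATE"
echo "model:  $MODEL_STATE"
```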