Web-based UI for labelling cells in microscopy images using SAM2.1 (Segment Anything Model). Designed for creating training data for downstream segmentation models.
- Click-to-segment: Left-click for positive points, right-click for negative points
- Multi-point prompting: Refine segmentation with multiple clicks before saving
- Keyboard-driven workflow: Extensive shortcuts for efficient labelling
- Multi-class support: Define cell types with color-coded masks
- Undo/Redo: Full annotation history
- Low-contrast optimization: CLAHE enhancement for microscopy images
- CPU/GPU support: Automatic device detection with CPU fallback
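SAM-style predictors take click prompts as two parallel arrays: point coordinates and point labels, where `1` marks a positive (foreground) click and `0` a negative (background) click. A minimal sketch of turning collected clicks into that form before a predict call (the helper name and click format are illustrative, not from this codebase):

```python
import numpy as np

def clicks_to_prompts(clicks):
    """Convert a list of (x, y, is_positive) clicks into the
    point_coords / point_labels arrays SAM-style predictors expect."""
    coords = np.array([(x, y) for x, y, _ in clicks], dtype=np.float32)
    labels = np.array([1 if pos else 0 for _, _, pos in clicks], dtype=np.int32)
    return coords, labels

# Two positive clicks on a cell, one negative click on background
clicks = [(120, 85, True), (130, 90, True), (40, 200, False)]
coords, labels = clicks_to_prompts(clicks)
print(coords.shape)     # (3, 2)
print(labels.tolist())  # [1, 1, 0]
```

Refining with multiple clicks before saving simply means re-running prediction with the growing arrays; "Remove last point" corresponds to dropping the final entry from each.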
```shell
cd backend
uv sync
uv run uvicorn app.main:app --reload
```

The API will be available at http://localhost:8000 (docs at http://localhost:8000/docs).
```shell
cd frontend
npm install
npm run dev
```

The UI will be available at http://localhost:5173.
```shell
USE_MOCK_SAM=true uv run uvicorn app.main:app --reload
```

| Shortcut | Action |
|---|---|
| Left Click | Add positive point |
| Right Click | Add negative point |
| Enter | Save segmentation |
| Backspace | Remove last point |
| Esc | Clear all points |
| Ctrl+Z | Undo |
| Ctrl+Y | Redo |
| N / → | Next image |
| P / ← | Previous image |
| 1-9 | Select class |
| V | Toggle annotations |
| H / ? | Show help |
- Backend: FastAPI (Python) - REST API for image serving, SAM inference, and annotation management
- Frontend: React + Vite (JavaScript) - Interactive canvas for click-to-segment labelling
- Model: SAM2.1 (sam2.1-hiera-small) with CLAHE contrast enhancement
```shell
cd backend
uv run ruff check .          # Lint
uv run ruff check --fix .    # Auto-fix
uv run basedpyright          # Type check
```

```shell
cd backend
uv run pytest                  # All tests
uv run pytest -k "test_name"   # By pattern
```

```shell
pre-commit install           # Set up (once)
pre-commit run --all-files   # Run manually
```

Place your image data in data/raw/. The backend expects a ZIP file with image slices that will be extracted automatically.
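The automatic extraction can be reproduced with Python's standard `zipfile` module. This sketch unpacks a slice archive and lists the resulting image files; the function name and the set of accepted extensions are assumptions for illustration, not the backend's actual code:

```python
import tempfile
import zipfile
from pathlib import Path

IMAGE_SUFFIXES = {".png", ".tif", ".tiff", ".jpg"}

def extract_slices(zip_path: Path, dest: Path) -> list[Path]:
    """Extract image slices from a ZIP archive into dest and
    return the extracted image paths, sorted by name."""
    dest.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(dest)
    return sorted(p for p in dest.rglob("*") if p.suffix.lower() in IMAGE_SUFFIXES)

# Demo with a throwaway archive containing two fake slices
with tempfile.TemporaryDirectory() as tmp:
    tmp = Path(tmp)
    archive = tmp / "slices.zip"
    with zipfile.ZipFile(archive, "w") as zf:
        zf.writestr("slice_000.png", b"fake")
        zf.writestr("slice_001.png", b"fake")
    extracted = extract_slices(archive, tmp / "raw")
    print([p.name for p in extracted])  # ['slice_000.png', 'slice_001.png']
```

Sorting by name matters here: slice order determines the Next/Previous navigation order in the UI.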