A command-line AI assistant that reads your resume, searches for jobs, generates tailored cover letters, emails, and ATS-optimised resumes, and produces professional PDFs with match-score radar charts.
Supports Ollama (fully local), OpenAI, and Anthropic — swap providers with a single .env variable.
- Resume Q&A — ask anything about your own resume via RAG (ChromaDB + embeddings)
- Job search — searches the web for relevant job postings via DuckDuckGo
- Job description scraping — fetches and parses job postings from any URL
- Document generation — produces cover letters, outreach emails, and ATS-friendly resumes
- Match analysis — scores your profile against a job description and renders a radar chart
- PDF export — all outputs are saved as formatted PDFs in `output/`
- Multi-provider — LLM and embedding providers are independently configurable
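Resume Q&A works by retrieving the resume chunks most relevant to the question before the LLM answers. The real pipeline stores nomic-embed-text or OpenAI embeddings in ChromaDB; the sketch below illustrates only the underlying retrieval idea with a toy bag-of-words "embedding" (all names here are illustrative, not the app's actual API):

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy bag-of-words 'embedding' — stands in for the real embedding model."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(question: str, chunks: list[str], k: int = 2) -> list[str]:
    """Return the k resume chunks most similar to the question."""
    q = embed(question)
    return sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)[:k]
```

In the app itself, `vector.py` handles chunking and indexing, and the retriever feeds the top matches into the LLM prompt.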
```
.
├── main.py       # Agent loop, tools wiring, CLI
├── vector.py     # ChromaDB setup, PDF indexing, retriever
├── tools.py      # LangChain tools: web search, PDF scraping, PDF generation, radar chart
├── files/
│   └── profile.pdf   # Your resume (add your own)
├── output/       # Generated PDFs (git-ignored)
└── .env          # Configuration (git-ignored — see .env.example)
```
- Python 3.11+
- Ollama installed and running locally (if using the `ollama` provider)
```
git clone <repo-url>
cd hr-assistant
python -m venv .venv
source .venv/bin/activate   # macOS/Linux
.venv\Scripts\activate      # Windows
pip install -r requirements.txt
```

Place your resume at `files/profile.pdf`. This file is git-ignored — it will never be committed.
Copy the example and fill in your values:
```
cp .env.example .env
```

Then edit `.env`:
```
# LLM Provider: ollama | openai | anthropic
LLM_PROVIDER=ollama

# Ollama (local)
OLLAMA_MODEL=qwen3:14b

# OpenAI
OPENAI_API_KEY=your-key-here
OPENAI_MODEL=gpt-4o

# Anthropic
ANTHROPIC_API_KEY=your-key-here
ANTHROPIC_MODEL=claude-sonnet-4-6

# Embedding Provider: ollama | openai
EMBEDDING_PROVIDER=ollama
OLLAMA_EMBEDDING_MODEL=nomic-embed-text
# OPENAI_EMBEDDING_MODEL=text-embedding-3-small

# Optional debug output
VERBOSE=false
```

If using Ollama, pull the models:

```
ollama pull qwen3:14b
ollama pull nomic-embed-text
```

Then start the assistant:

```
python main.py
```

The assistant starts an interactive loop. Example prompts:
```
How many years of experience do I have?
What are my strongest technical skills?
https://example.com/jobs/software-engineer-123
Search for senior Python developer jobs in London posted in the last month
```
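Under the hood, `main.py` runs a simple read-eval loop around the agent. A minimal sketch of that loop (the `agent` callable and the prompt string are assumptions for illustration, not the actual implementation, which also wires up tools and conversation history):

```python
def repl(agent) -> None:
    """Read a prompt, hand it to the agent, print the reply; stop on 'exit'."""
    while True:
        user_input = input("> ").strip()
        if user_input.lower() == "exit":
            break
        if not user_input:
            continue  # ignore empty lines
        print(agent(user_input))
```

A pasted job URL is just another line of input here; the agent decides to invoke the scraping and generation tools based on it.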
When you paste a job URL, the assistant will automatically:
- Fetch the job description
- Write a tailored cover letter
- Write an outreach email to the hiring manager
- Generate an ATS-optimised resume
- Produce a match-score analysis with a radar chart
- Save all four documents as PDFs in `output/`
Type `exit` to quit.
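The match analysis behind the radar chart reduces to scoring your profile against the job description on several axes. In the app the scoring is LLM-driven; the sketch below shows a simple keyword-overlap alternative purely to illustrate the shape of the result (category names and keywords are made up):

```python
def match_scores(resume: str, categories: dict[str, list[str]]) -> dict[str, float]:
    """Score each category as the fraction of its job-relevant keywords
    that also appear in the resume text. Illustrative only — the real
    analysis is produced by the LLM, not keyword matching."""
    resume_words = set(resume.lower().split())
    scores = {}
    for name, keywords in categories.items():
        hits = sum(1 for kw in keywords if kw.lower() in resume_words)
        scores[name] = round(hits / len(keywords), 2) if keywords else 0.0
    return scores
```

Each axis of the radar chart is one such category score, so a quick glance shows where the profile is strong or weak for a given posting.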
| Provider | LLM quality | Embedding quality | Cost | Privacy |
|---|---|---|---|---|
| Ollama | Good (model-dependent) | Good | Free | Fully local |
| OpenAI | Excellent | Excellent | Pay-per-use | Cloud |
| Anthropic | Excellent | — (use OpenAI embeddings) | Pay-per-use | Cloud |
Note: Each embedding provider uses its own separate ChromaDB collection (`chroma_db_ollama`, `chroma_db_openai`). Switching providers re-indexes your resume automatically on first run.
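Swapping providers via a single `.env` variable typically comes down to a small dispatch on `LLM_PROVIDER`. A hypothetical sketch of how such resolution might look (defaults mirror the example config above, but the real wiring in `main.py` may differ):

```python
import os

# Illustrative provider table; the actual app builds LangChain chat models here.
PROVIDERS = {
    "ollama": {"model_env": "OLLAMA_MODEL", "default": "qwen3:14b"},
    "openai": {"model_env": "OPENAI_MODEL", "default": "gpt-4o"},
    "anthropic": {"model_env": "ANTHROPIC_MODEL", "default": "claude-sonnet-4-6"},
}

def resolve_llm() -> tuple[str, str]:
    """Return (provider, model) from the environment, defaulting to ollama."""
    provider = os.getenv("LLM_PROVIDER", "ollama").lower()
    if provider not in PROVIDERS:
        raise ValueError(f"Unsupported LLM_PROVIDER: {provider!r}")
    spec = PROVIDERS[provider]
    return provider, os.getenv(spec["model_env"], spec["default"])
```

Embedding resolution would follow the same pattern with `EMBEDDING_PROVIDER`, which is why the two can be configured independently.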
**`attempt to write a readonly database`**
ChromaDB 1.5+ requires WAL journal mode. The app handles this automatically on startup. If you see this error, make sure you are running the latest code.
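The automatic fix amounts to switching ChromaDB's underlying SQLite file to WAL journal mode on startup. A sketch of that step (the function name and path handling are illustrative; the app's actual startup code may differ):

```python
import sqlite3

def ensure_wal(db_path: str) -> str:
    """Put a SQLite database into WAL journal mode and return the active mode.
    Sketch of the fix the app applies automatically on startup."""
    with sqlite3.connect(db_path) as conn:
        # PRAGMA journal_mode returns the mode actually in effect.
        mode = conn.execute("PRAGMA journal_mode=WAL").fetchone()[0]
    return mode
```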
**`No module named 'pypdf'`**

Run `pip install pypdf`.

**Ollama not responding**

Make sure Ollama is running (`ollama serve`) and the model is pulled (`ollama list`).
Generated documents are saved to `output/` with the naming convention:

```
<company>_<job_title>_<YYYY-MM-DD_HH-mm-ss>.cover_letter.pdf
<company>_<job_title>_<YYYY-MM-DD_HH-mm-ss>.email.pdf
<company>_<job_title>_<YYYY-MM-DD_HH-mm-ss>.resume.pdf
<company>_<job_title>_<YYYY-MM-DD_HH-mm-ss>.analysis.pdf
```
MIT