flaviofrancisco/ollama-resume-analysis
HR Assistant — AI-Powered Resume & Career Coach

A command-line AI assistant that reads your resume, searches for jobs, generates tailored cover letters, emails, and ATS-optimised resumes, and produces professional PDFs with match-score radar charts.

Supports Ollama (fully local), OpenAI, and Anthropic — swap providers with a single .env variable.


Features

  • Resume Q&A — ask anything about your own resume via RAG (ChromaDB + embeddings)
  • Job search — searches the web for relevant job postings via DuckDuckGo
  • Job description scraping — fetches and parses job postings from any URL
  • Document generation — produces cover letters, outreach emails, and ATS-friendly resumes
  • Match analysis — scores your profile against a job description and renders a radar chart
  • PDF export — all outputs saved as formatted PDFs in output/
  • Multi-provider — LLM and embedding provider are independently configurable
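The match analysis scores your profile against a job description along several dimensions. The repository delegates the actual judging to the LLM; as a rough illustration of the idea, here is a naive keyword-overlap scorer (function name, categories, and scoring rule are all hypothetical, not the repo's actual code):

```python
def match_scores(resume_text: str, job_text: str,
                 categories: dict[str, list[str]]) -> dict[str, float]:
    """Illustrative keyword-overlap scoring per category.

    For each category, keep only the keywords the job posting actually
    mentions, then score the resume by what fraction of those it covers.
    The real tool asks the LLM to judge each dimension instead.
    """
    resume = resume_text.lower()
    job = job_text.lower()
    scores: dict[str, float] = {}
    for name, keywords in categories.items():
        required = [k for k in keywords if k in job]
        if not required:
            # The job posting asks for nothing in this category.
            scores[name] = 100.0
            continue
        hits = sum(1 for k in required if k in resume)
        scores[name] = round(100 * hits / len(required), 1)
    return scores
```

The per-category scores are what a radar chart would plot, one axis per category.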

Project Structure

.
├── main.py          # Agent loop, tools wiring, CLI
├── vector.py        # ChromaDB setup, PDF indexing, retriever
├── tools.py         # LangChain tools: web search, PDF scraping, PDF generation, radar chart
├── files/
│   └── profile.pdf  # Your resume (add your own)
├── output/          # Generated PDFs (git-ignored)
└── .env             # Configuration (git-ignored — see .env.example)

Requirements

  • Python 3.11+
  • Ollama installed and running locally (if using the ollama provider)

Setup

1. Clone the repo

git clone <repo-url>
cd hr-assistant

2. Create and activate a virtual environment

python -m venv .venv
source .venv/bin/activate      # macOS/Linux
.venv\Scripts\activate         # Windows

3. Install dependencies

pip install -r requirements.txt

4. Add your resume

Place your resume as files/profile.pdf. This file is git-ignored — it will never be committed.

5. Configure environment variables

Copy the example and fill in your values:

cp .env.example .env

Then edit .env:

# LLM Provider: ollama | openai | anthropic
LLM_PROVIDER=ollama

# Ollama (local)
OLLAMA_MODEL=qwen3:14b

# OpenAI
OPENAI_API_KEY=your-key-here
OPENAI_MODEL=gpt-4o

# Anthropic
ANTHROPIC_API_KEY=your-key-here
ANTHROPIC_MODEL=claude-sonnet-4-6

# Embedding Provider: ollama | openai
EMBEDDING_PROVIDER=ollama
OLLAMA_EMBEDDING_MODEL=nomic-embed-text
# OPENAI_EMBEDDING_MODEL=text-embedding-3-small

# Optional debug output
VERBOSE=false
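Provider selection boils down to reading `LLM_PROVIDER` and picking the matching model/key variables. A minimal sketch of that dispatch, assuming the defaults from the example `.env` above (the helper name `select_llm_config` is hypothetical, not the function in `main.py`):

```python
import os

def select_llm_config() -> dict:
    """Resolve LLM settings from environment variables (illustrative sketch)."""
    provider = os.getenv("LLM_PROVIDER", "ollama").lower()
    if provider == "ollama":
        # Fully local: no API key needed.
        return {"provider": "ollama",
                "model": os.getenv("OLLAMA_MODEL", "qwen3:14b")}
    if provider == "openai":
        return {"provider": "openai",
                "model": os.getenv("OPENAI_MODEL", "gpt-4o"),
                "api_key": os.environ["OPENAI_API_KEY"]}  # fail fast if missing
    if provider == "anthropic":
        return {"provider": "anthropic",
                "model": os.getenv("ANTHROPIC_MODEL", "claude-sonnet-4-6"),
                "api_key": os.environ["ANTHROPIC_API_KEY"]}
    raise ValueError(f"Unknown LLM_PROVIDER: {provider!r}")
```

Note that failing fast on a missing API key (via `os.environ[...]`) surfaces misconfiguration at startup rather than mid-conversation.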

6. Pull Ollama models (if using Ollama)

ollama pull qwen3:14b
ollama pull nomic-embed-text

Usage

python main.py

The assistant starts an interactive loop. Example prompts:

How many years of experience do I have?
What are my strongest technical skills?
https://example.com/jobs/software-engineer-123
Search for senior Python developer jobs in London posted in the last month

When you paste a job URL, the assistant will automatically:

  1. Fetch the job description
  2. Write a tailored cover letter
  3. Write an outreach email to the hiring manager
  4. Generate an ATS-optimised resume
  5. Produce a match-score analysis with a radar chart
  6. Save all four documents as PDFs in output/

Type exit to quit.


Provider Comparison

| Provider  | LLM quality            | Embedding quality         | Cost        | Privacy     |
|-----------|------------------------|---------------------------|-------------|-------------|
| Ollama    | Good (model-dependent) | Good                      | Free        | Fully local |
| OpenAI    | Excellent              | Excellent                 | Pay-per-use | Cloud       |
| Anthropic | Excellent              | — (use OpenAI embeddings) | Pay-per-use | Cloud       |

Note: Each embedding provider uses its own separate ChromaDB collection (chroma_db_ollama, chroma_db_openai). Switching providers re-indexes your resume automatically on first run.
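Keeping one directory per embedding provider matters because embeddings from different models have different dimensions and cannot share a collection. A sketch of that mapping (helper names are hypothetical; `vector.py` may structure this differently):

```python
import os

def chroma_dir(embedding_provider: str) -> str:
    """Map the embedding provider to its own persistent ChromaDB directory,
    so vectors of different dimensions never end up in the same collection."""
    if embedding_provider not in ("ollama", "openai"):
        raise ValueError(f"Unsupported embedding provider: {embedding_provider!r}")
    return f"chroma_db_{embedding_provider}"

def needs_indexing(embedding_provider: str) -> bool:
    """First run for a provider: its directory does not exist yet,
    so the resume must be (re-)indexed."""
    return not os.path.isdir(chroma_dir(embedding_provider))
```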


Troubleshooting

attempt to write a readonly database
ChromaDB 1.5+ requires WAL journal mode. The app handles this automatically on startup. If you see this error, make sure you are running the latest code.
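ChromaDB persists to a SQLite file, and switching that file to WAL (write-ahead logging) journal mode is a one-line pragma. A minimal sketch of the kind of startup fix described above, using only the standard library (the path and function name are illustrative):

```python
import sqlite3

def enable_wal(db_path: str) -> str:
    """Switch a SQLite database file to WAL journal mode.

    Returns the resulting journal mode as reported by SQLite
    (expected: "wal"). Connecting creates the file if it is new.
    """
    with sqlite3.connect(db_path) as conn:
        mode = conn.execute("PRAGMA journal_mode=WAL").fetchone()[0]
    return mode
```

WAL mode allows concurrent readers while a writer is active, which avoids the "readonly database" contention error.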

No module named 'pypdf'

pip install pypdf

Ollama not responding
Make sure Ollama is running (ollama serve) and the model is pulled (ollama list).


Output Files

Generated documents are saved to output/ with the naming convention:

<company>_<job_title>_<YYYY-MM-DD_HH-mm-ss>.cover_letter.pdf
<company>_<job_title>_<YYYY-MM-DD_HH-mm-ss>.email.pdf
<company>_<job_title>_<YYYY-MM-DD_HH-mm-ss>.resume.pdf
<company>_<job_title>_<YYYY-MM-DD_HH-mm-ss>.analysis.pdf

License

MIT
