ChatHPC Application

Status badges: DOI, GitHub CI, and GitHub Release (on GitHub); GitLab CI, coverage, and release (on GitLab, requires access and sign-in to code.ornl.gov).

Documentation: https://chathpc-app.readthedocs.io/

Internal Development Documentation: https://devdocs.ornl.gov/ChatHPC/ChatHPC-app

Internal Coverage Report: https://devdocs.ornl.gov/ChatHPC/ChatHPC-app/coverage

Creating a new ChatHPC application.

Table of Contents

Installation

For development from a local clone:

git clone git@github.com:ORNL/ChatHPC-app.git
cd ChatHPC-app
python3 -m venv --upgrade-deps --prompt $(basename $PWD) .venv
source .venv/bin/activate
pip install -e .

For use in a virtual environment:

python3 -m venv --upgrade-deps --prompt $(basename $PWD) .venv
source .venv/bin/activate
pip install git+ssh://git@github.com/ORNL/ChatHPC-app.git
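
Either way, the installation can be verified by printing the CLI help:

chathpc --help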

Docker Usage

Pull and Run Pre-built Image

Pull the latest image from GitHub Container Registry:

docker pull ghcr.io/ornl/chathpc-app:latest

Run the ChatHPC CLI:

# Show help
docker run --rm ghcr.io/ornl/chathpc-app:latest

# Run with your data (mount volumes as needed)
docker run --rm -v $(pwd):/data ghcr.io/ornl/chathpc-app:latest chathpc --config /data/config.json

# Interactive shell
docker run --rm -it ghcr.io/ornl/chathpc-app:latest /bin/bash

GPU Support

To use GPU acceleration with the Docker container, you need to install the NVIDIA Container Toolkit and pass GPU access to Docker.

Install NVIDIA Container Toolkit:

# Add NVIDIA package repositories
distribution=$(. /etc/os-release;echo $ID$VERSION_ID)
curl -s -L https://nvidia.github.io/nvidia-docker/gpgkey | sudo apt-key add -
curl -s -L https://nvidia.github.io/nvidia-docker/$distribution/nvidia-docker.list | sudo tee /etc/apt/sources.list.d/nvidia-docker.list

# Install nvidia-container-toolkit
sudo apt-get update
sudo apt-get install -y nvidia-container-toolkit

# Restart Docker daemon
sudo systemctl restart docker
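
If Docker does not pick up the NVIDIA runtime after the restart, newer toolkit versions may also need the runtime registered explicitly (this step is not required on every setup):

# Register the NVIDIA runtime with Docker, then restart the daemon again
sudo nvidia-ctk runtime configure --runtime=docker
sudo systemctl restart docker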

Run with GPU access:

# Run with all GPUs
docker run --rm --gpus all ghcr.io/ornl/chathpc-app:latest

# Run with specific GPU(s)
docker run --rm --gpus '"device=0"' ghcr.io/ornl/chathpc-app:latest
docker run --rm --gpus '"device=0,1"' ghcr.io/ornl/chathpc-app:latest

# Run with GPU and mount data
docker run --rm --gpus all -v $(pwd):/data ghcr.io/ornl/chathpc-app:latest chathpc --config /data/config.json

# Verify GPU access inside container
docker run --rm --gpus all ghcr.io/ornl/chathpc-app:latest nvidia-smi

Note: The Docker image includes CUDA libraries, but the host system must have compatible NVIDIA drivers installed.

Build Image Locally

Build the Docker image from the repository:

git clone git@github.com:ORNL/ChatHPC-app.git
cd ChatHPC-app
docker build -t chathpc-app .

Run locally built image:

docker run --rm chathpc-app

Available Tags

  • latest - Latest build from main branch
  • <branch-name> - Latest build from specific branch
  • <sha> - Specific commit SHA

Example:

docker pull ghcr.io/ornl/chathpc-app:main
docker pull ghcr.io/ornl/chathpc-app:abc1234

Setup pre-commit Git hooks

Use hatch, or install pre-commit inside a Python virtual environment.

hatch shell

or

pip install pre-commit

Then install the hooks.

pre-commit install

Note: You might have to update the hooks to their latest versions.

pre-commit autoupdate

Note: The markdown linter requires Ruby (with RubyGems) to be installed so that mdl can be auto-installed and run.

On Ubuntu this can be done with:

sudo apt install ruby-full
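
Optionally, run all hooks once across the entire repository to verify the setup:

pre-commit run --all-files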

Quick Start

See Creating a new ChatHPC application.

CLI Interface

ChatHPC

Get Help:

$ chathpc --help
Usage: chathpc [OPTIONS] COMMAND [ARGS]...

Options:
  -h, --help  Show this message and exit.

Commands:
  config      Print current config
  run         Interact with the model.
  run-base    Interact with the base model.
  run-fine    Interact with the finetuned model.
  run-merged  Interact with the merged model.
  train       Finetune the model.
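
Print the current configuration (the exact output depends on your config file):

chathpc config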

Run interactively:

chathpc run

Example interactive session:

$ chathpc run
chathpc ()> /context
Context: Introduction to Kokkos programming model
chathpc (Introduction to Kokkos programming model)> Which kind of Kokkos views are?
<s> You are a powerful LLM model for Kokkos. Your job is to answer questions about Kokkos programming model. You are given a question and context regarding Kokkos programming model.

You must output the Kokkos question that answers the question.

### Input:
Which kind of Kokkos views are?

### Context:
Introduction to Kokkos programming model

### Response:
There are two different layouts; LayoutLeft and LayoutRight.
</s>
chathpc (Introduction to Kokkos programming model)> \bye

Train:

chathpc train
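
To point at a specific configuration file, the --config option used in the Docker examples should work the same way outside the container (assuming it is accepted as a global option):

chathpc --config config.json train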

ChatHPC JSON to MD

Get Help:

$ chathpc-json-to-md -h
usage: chathpc-json-to-md [-h] [--debug] [--log_level LOG_LEVEL] [--add_rating_template] [json]

Convert Json files to Markdown for ease of reading.

positional arguments:
  json                  Json string or path to json file.

options:
  -h, --help            show this help message and exit
  --debug               Open debug port (5678).
  --log_level LOG_LEVEL
                        Log level.
  --add_rating_template
                        Add rating template to markdown.

Example:

chathpc-json-to-md input.json > output.md
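
The --add_rating_template option from the help output can be combined with the same redirection:

chathpc-json-to-md --add_rating_template input.json > output.md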

Upgrade packages in uv.lock

See Upgrading locked package versions.

With an existing uv.lock file, uv will prefer the previously locked versions of packages when running uv sync and uv lock. Package versions will only change if the project's dependency constraints exclude the previous, locked version.

To upgrade all packages:

uv lock --upgrade
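
To upgrade a single locked package instead of the whole lock file:

uv lock --upgrade-package <package>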

Running with hatch

hatch shell

Testing with hatch

hatch run test

To test on all Python versions:

hatch run all:test

To run the tests and print their output:

hatch run test -v -s
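
Since -v and -s are pytest flags, the test script appears to pass extra arguments through to pytest, so a single test file can be targeted the same way (the path below is only an example):

hatch run test tests/test_app.py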

Format code with hatch

hatch fmt

Update default ruff rules:

hatch fmt --check --sync

View version with hatch

hatch version

Update version with hatch

hatch version <new version>
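
For example, with the project's date-based scheme (illustrative value):

hatch version 2025.5.1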

Update version with script

An automated script is provided to update the version using a date-based versioning scheme. The script determines the next version to use, then updates the version, updates the changelog, and commits the changes. Lastly, it tags the commit.

scripts/version_bump.py

Documentation

Documentation is built with mkdocs using the Read the Docs theme.

Commands

  • mkdocs new [dir-name] - Create a new project.
  • mkdocs serve - Start the live-reloading docs server.
  • mkdocs build - Build the documentation site.
  • mkdocs -h - Print help message and exit.

Other useful commands:

  • mkdocs serve -a 0.0.0.0:8000 - Serve with external access to the site. (Useful in ExCL to view the site through FoxyProxy.)

Hatch Commands

View environment

hatch env show docs

Build documentation

hatch run docs:build

Serve documentation

hatch run docs:serve

or

hatch run docs:serve -a 0.0.0.0:8000
