Code for curating, processing, and analyzing the NIBS myelin mapping pilot dataset, developed by the Lifespan Informatics and Neuroimaging Center (LINC) at Penn. The study acquires multi-modal MRI data (T1w, T2w, MP2RAGE, ihMTRAGE, MESE, QSM, dMRI) across two sessions to derive and compare myelin-sensitive measures including g-ratio, T1w/T2w ratio, R1, ihMT saturation, T2*, and susceptibility-based metrics.
| Directory | Contents |
|---|---|
| curation/ | Numbered pipeline scripts (shell + Python) that convert raw DICOMs into a BIDS dataset |
| processing/ | Python processing pipelines for each imaging modality |
| analysis/ | Scripts and notebooks for correlation matrices, parcellation, brain masks, and figures |
| data/ | Tabular outputs (demographics, correlation matrices, missingness) safe to share on GitHub |
| figures/ | Manuscript figures generated by analysis scripts |
| tests/ | Pytest test suite |
| ng_analysis/ | ICC analysis code and figures |
Scripts in curation/ are run in order. Shell scripts (.sh) are designed for SLURM on CUBIC;
Python scripts (.py) can be run locally.
| Step | Script | Description |
|---|---|---|
| 00 | 00_download_data.sh | Download data from the scanner for each subject |
| 01 | 01_unzip_dicom_zips.py | Unzip top-level DICOM archives |
| 02 | 02_unzip_dicoms.py | Unzip nested DICOM zips in the scitran layout |
| 03 | 03_run_heudiconv.sh | Run heudiconv to convert DICOMs to BIDS (uses heuristic.py) |
| 03b | 03b_convert_mp2rage_phase.py | Convert MP2RAGE RR phase DICOMs to NIfTI with dcm2niix |
| 03c | 03c_fix_mp2rage_phase.py | Fix MP2RAGE phase filenames and remove bad conversions |
| 04 | 04_split_ihmt.py | Split 4D ihMTRAGE NIfTIs into individual 3D BIDS volumes |
| 05 | 05_chmod.sh | Set write permissions for JSON editing |
| 06 | 06_anonymize_acqtimes.py | Round acquisition times and write session-level metadata |
| 07 | 07_clean_jsons.py | Remove sensitive/unneeded fields from JSON sidecars |
| 08 | 08_fix_bids.py | Post-heudiconv fixes (bvec/bval, phase units, MP2RAGE/MESE metadata) |
| 09 | 09_validate_bids.sh | Run the BIDS validator |
| 10 | 10_initialize_datalad.sh | Initialize a DataLad dataset and make the initial commit |
| 11 | 11_reface_anatomicals.sh | Deface T1w, T2w, and MP2RAGE images |
| 12 | 12_rename_bad_mese.py | Correct swapped PA/AP phase-encoding labels in MESE files |
In a perfect world, 03b_convert_mp2rage_phase.py, 03c_fix_mp2rage_phase.py, and 12_rename_bad_mese.py would be handled by heudiconv or in 08_fix_bids.py.
Each module in processing/ is a standalone script. All modules share a common utils.py
providing load_config, run_command, get_filename, coregister_to_t1, and other helpers.
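The contents of utils.py are not shown here; as an illustration, a helper like run_command could be a thin wrapper around the standard library's subprocess module (this is a hypothetical sketch, not the repository's actual implementation):

```python
import subprocess


def run_command(cmd, check=True):
    """Run an external imaging tool, echoing the command line first.

    Hypothetical sketch: the real utils.run_command may differ in
    signature and behavior (e.g., logging, environment handling).
    """
    print("Running:", " ".join(cmd))
    result = subprocess.run(cmd, capture_output=True, text=True, check=check)
    return result.stdout
```

With check=True, a nonzero exit status raises subprocess.CalledProcessError, which keeps a multi-step pipeline from silently continuing after a failed tool invocation.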
| Module | Description |
|---|---|
| process_mp2rage.py | B1 correction and R1 mapping via pymp2rage |
| process_ihmt.py | ihMTRAGE processing: motion correction, coregistration, ihMT/MTR maps |
| process_mese.py | Multi-echo spin-echo T2*/R2* fitting |
| process_t1wt2w_ratio.py | T1w/T2w myelin-weighted ratio maps |
| process_qsm_prep.py | QSM preparation (multi-echo GRE preprocessing) |
| process_qsm_sepia.py | QSM estimation via SEPIA (MATLAB/SPM) |
| process_qsm_chisep.py | QSM estimation via Chi-separation toolbox (MATLAB/SPM) |
| process_qsm_post.py | QSM post-processing and chi-separation |
| process_g_ratio.py | G-ratio computation from myelin and axon volume fractions |
| process_g_ratio_scaling_factors.py | Scaling factor calibration for g-ratio inputs |
| generate_myelin_reports.py | HTML reports with scalar map overlays |
| rename_chisep_outputs.py | Rename chi-separation output files to BIDS conventions |
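As background for the g-ratio step: the aggregate g-ratio is commonly computed from the myelin volume fraction (MVF) and axon volume fraction (AVF) as g = sqrt(AVF / (AVF + MVF)). A minimal voxelwise sketch (the function name and interface are illustrative; process_g_ratio.py additionally handles NIfTI I/O, masking, and the calibrated scaling factors):

```python
import numpy as np


def g_ratio(mvf, avf):
    """Aggregate g-ratio: g = sqrt(AVF / (AVF + MVF)).

    Illustrative only; the repository's process_g_ratio.py may apply
    scaling factors and masking before this step.
    """
    mvf = np.asarray(mvf, dtype=float)
    avf = np.asarray(avf, dtype=float)
    with np.errstate(divide="ignore", invalid="ignore"):
        return np.sqrt(avf / (avf + mvf))
```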
| Script | Description |
|---|---|
| parcellate_scalar_maps.py | Extract parcellated values per subject/session |
| generate_correlation_matrices.py | Compute mean Fisher-z correlation matrices across subjects |
| plot_correlation_matrices.py | Basic heatmap of correlation matrices |
| plot_correlation_matrices_clustered.py | Hierarchically clustered correlation heatmaps |
| plot_correlation_matrices_mukherjee.py | Mukherjee-style correlation plots |
| plot_correlation_matrix_diffs.py | Difference maps between tissue-type correlation matrices |
| plot_myelin_scalar_maps.py | Mean scalar map brain overlays per session |
| build_brain_mask.py | Build a study-wide brain mask from multi-modal masks |
| calculate_brain_mask_dice.py | Dice overlap between modality masks and smriprep |
| build_missingness_list.py | Missingness matrix for acquired modalities |
Notebooks in analysis/ include plot_missingness.ipynb, simulate_correlations.ipynb,
and simulate_correlations2.ipynb.
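The Fisher-z averaging performed by generate_correlation_matrices.py can be sketched as follows (a minimal illustration under the usual convention, not the script's actual code):

```python
import numpy as np


def mean_correlation(corr_stack):
    """Average a (subjects x regions x regions) stack of correlation
    matrices via Fisher z: z = arctanh(r), mean over subjects, tanh back.

    Values are clipped just inside (-1, 1) so the diagonal (r = 1)
    stays finite under arctanh.
    """
    z = np.arctanh(np.clip(corr_stack, -0.999999, 0.999999))
    return np.tanh(np.nanmean(z, axis=0))
```

Averaging in z-space rather than r-space avoids the bias introduced by the bounded, skewed sampling distribution of Pearson r.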
Install the dependency set that matches your task:

```bash
conda env create -f environment_curation.yml
conda env create -f environment_processing.yml
conda env create -f environment_test.yml
```

Activate the environment you need to use:

```bash
conda activate curation
conda activate processing
conda activate test
```

All paths are configured in paths.yaml at the repository root.
Update project_root to match your local or cluster mount point:

```yaml
project_root: /cbica/projects/nibs
```

Scripts load this file via config.load_config(), which resolves all
relative paths against project_root and returns a dictionary of absolute paths.
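The path-resolution step described above might look like the following (resolve_paths is a hypothetical stand-in; the real config.load_config() also reads the YAML file itself):

```python
from pathlib import Path


def resolve_paths(cfg):
    """Resolve relative entries against project_root, returning Path objects.

    Illustrative stand-in for the resolution step inside config.load_config();
    absolute entries (including project_root itself) pass through unchanged.
    """
    root = Path(cfg["project_root"])
    return {key: Path(val) if Path(val).is_absolute() else root / val
            for key, val in cfg.items()}
```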
To run the test suite:

```bash
pip install -r requirements_test.txt
pytest
```

Expected directory layout on the cluster:

```
/cbica/projects/nibs/
|- code/         # Local clone of the GitHub repository
|- figures/      # Any figures for the manuscript
|_ data/         # Tabular data that may be shared on GitHub
|- dset/         # BIDS-compliant dataset
|- derivatives/  # Derivative data
|_ sourcedata/   # Raw data from the scanner
```