UniDAC: Universal Metric Depth Estimation for Any Camera

Girish Chandar Ganesan¹ · Yuliang Guo² · Liu Ren² · Xiaoming Liu¹,³

¹Michigan State University   ²Bosch Research North America   ³University of North Carolina at Chapel Hill

Paper PDF Project Page

CVPR 2026


News

  • Training code for UniDAC.
  • Demo code for images with unknown camera parameters.
  • Demo code for easy setup and usage.
  • 2026-03-13: Release of UniDAC checkpoint trained on moderately sized datasets.
  • 2026-03-13: Testing and evaluation pipeline for zero-shot metric depth estimation on perspective, fisheye, and 360-degree datasets.
  • 2026-03-13: Data preparation and curation scripts.
  • 2026-02-20: UniDAC accepted by CVPR 2026!

Pipeline


Performance

UniDAC outperforms all prior metric depth estimation methods trained on perspective images, on both indoor and outdoor datasets, setting the state of the art in zero-shot cross-camera generalization and domain robustness. UniDAC also outperforms UniK3D, even though the latter is trained on large-FoV images with a much larger training set, demonstrating the robustness of UniDAC. Matterport3D is part of UniK3D's training set, so we omit its results on that benchmark.

| Method | Dataset size | ScanNet++ δ₁ ↑ | ScanNet++ Abs.Rel ↓ | Pano3D-GV2 δ₁ ↑ | Pano3D-GV2 Abs.Rel ↓ | KITTI-360 δ₁ ↑ | KITTI-360 Abs.Rel ↓ | Matterport3D δ₁ ↑ | Matterport3D Abs.Rel ↓ |
|---|---|---|---|---|---|---|---|---|---|
| UniK3D | 8M | 0.651 | 0.253 | 0.785 | 0.170 | 0.817 | 0.244 | - | - |
| Metric3Dv2 | 16M | 0.536 | 0.223 | 0.404 | 0.307 | 0.716 | 0.200 | 0.438 | 0.292 |
| UniDepth | 3M | 0.364 | 0.497 | 0.247 | 0.789 | 0.481 | 0.294 | 0.258 | 0.765 |
| DACU | 0.8M | 0.658 | 0.233 | 0.684 | 0.203 | 0.708 | 0.186 | 0.662 | 0.215 |
| UniDAC | 1.45M | 0.918 | 0.097 | 0.768 | 0.161 | 0.836 | 0.141 | 0.745 | 0.175 |
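δ₁ and Abs.Rel are the standard depth-estimation metrics: the fraction of pixels whose prediction/ground-truth ratio is within 1.25, and the mean absolute relative error. A minimal NumPy sketch of how they are typically computed (an illustration of the metric definitions, not the repository's actual evaluation code):

```python
import numpy as np

def depth_metrics(pred, gt, threshold=1.25):
    """Compute delta_1 (ratio accuracy) and Abs.Rel over valid pixels."""
    pred, gt = np.asarray(pred, float), np.asarray(gt, float)
    valid = gt > 0                      # ignore pixels without ground truth
    pred, gt = pred[valid], gt[valid]
    ratio = np.maximum(pred / gt, gt / pred)
    delta1 = np.mean(ratio < threshold)        # fraction within 25% of GT
    abs_rel = np.mean(np.abs(pred - gt) / gt)  # mean absolute relative error
    return delta1, abs_rel

# Toy example: a perfect prediction gives delta_1 = 1.0, Abs.Rel = 0.0
d1, rel = depth_metrics([1.0, 2.0, 4.0], [1.0, 2.0, 4.0])
```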

Installation

Clone the Repository

git clone https://github.com/girish1511/UniDAC
cd UniDAC

Conda Installation

This repository can be set up with Conda:

conda create -n unidac python=3.10.18 -y
conda activate unidac
pip install torch==2.7.0 torchvision==0.22.0 torchaudio==2.7.0 --index-url https://download.pytorch.org/whl/cu118

pip install -r requirements.txt
export PYTHONPATH="$PWD:$PYTHONPATH"

Data Preparation

The training set consists of four outdoor datasets and three indoor datasets. The testing set consists of two 360° datasets, two fisheye datasets, and four perspective datasets.

Please refer to DATA.md for detailed datasets preparation.

Demo

We provide a ready-to-run demo script in the demo folder, along with the required sample inputs in demo/input. demo/demo_unidac.py demonstrates the inference pipeline for diverse camera types and scenes, including ScanNet++ (indoor, fisheye), Matterport3D (indoor, 360°), and KITTI-360 (outdoor, fisheye), using a unified model trained only on perspective images.

Download the checkpoint and place it in checkpoints/. Then run the demo script with the following command; the visualizations will be stored in demo/output:

bash demo.sh
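The visualizations in demo/output are colorized depth maps. As a rough illustration of the post-processing step involved (the actual demo script may differ; the function name `colorize_depth` and the tiny synthetic depth array below are illustrative assumptions), a metric depth map can be normalized to 8-bit for display:

```python
import numpy as np

def colorize_depth(depth, vmin=None, vmax=None):
    """Normalize a metric depth map (meters) to uint8 for visualization.

    Invalid pixels (depth <= 0) are mapped to 0.
    """
    depth = np.asarray(depth, float)
    valid = depth > 0
    if not valid.any():
        return np.zeros(depth.shape, np.uint8)
    lo = depth[valid].min() if vmin is None else vmin
    hi = depth[valid].max() if vmax is None else vmax
    out = np.zeros(depth.shape, np.uint8)
    scale = (depth[valid] - lo) / max(hi - lo, 1e-6)
    out[valid] = np.clip(scale * 255, 0, 255).astype(np.uint8)
    return out

# Synthetic 2x2 "depth map" in meters; 0 marks an invalid pixel.
vis = colorize_depth([[0.0, 1.0], [2.0, 3.0]])
```

The resulting uint8 array can then be passed through any colormap (e.g. matplotlib's) before saving.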

Testing

Download the checkpoint and place it in checkpoints/.

Run the following to evaluate and reproduce the results presented in the paper:

bash eval.sh <domain> <dataset>

Different config files for evaluating the reported testing datasets are included in configs/test. Refer to the table below to set the <domain> and <dataset> arguments, which together select the corresponding configuration file for the dataset you wish to evaluate.

| | ScanNet++ | Matterport3D | Pano3D-GibsonV2 | KITTI-360 | KITTI | NYU | NuScenes | iBims-1 |
|---|---|---|---|---|---|---|---|---|
| `<domain>` | indoor | indoor | indoor | outdoor | outdoor | indoor | outdoor | indoor |
| `<dataset>` | scannetpp | gv2 | scannetpp | kitti360 | kitti | nyu | nuscenes | ibims |
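For example, `bash eval.sh indoor scannetpp` evaluates on ScanNet++. A tiny shell helper (hypothetical, not part of this repo; `domain_for` is an assumed name) can look up the `<domain>` argument for the unambiguous dataset names in the table:

```shell
#!/bin/sh
# Hypothetical helper: look up the <domain> argument for a dataset name,
# following the mapping in the table above.
domain_for() {
  case "$1" in
    scannetpp|gv2|nyu|ibims) echo indoor ;;
    kitti360|kitti|nuscenes) echo outdoor ;;
    *) echo "unknown dataset: $1" >&2; return 1 ;;
  esac
}

# Usage: bash eval.sh "$(domain_for kitti360)" kitti360
```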

Acknowledgements

We thank the authors of the following awesome codebases:

License

This software is released under the MIT license. You can view a license summary here.

Citation
