DOGS

DOGS: Distributed-Oriented Gaussian Splatting for Large-Scale 3D Reconstruction Via Gaussian Consensus

[🌐 Project Page | arXiv] (NeurIPS 2024)


🛠️ Installation

Install the conda environment (the same one used by ZeroGS):

conda create -n dogs python=3.9
conda activate dogs
cd DOGS
./scripts/env/install.sh
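
As a quick sanity check, you can verify that the environment sees your GPU. This assumes the install script sets up PyTorch with CUDA support, which is an assumption about install.sh rather than something documented here:

conda activate dogs
# Should print the PyTorch version and `True` if CUDA is visible.
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"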

🤷 Introduction

Our method accelerates the training of 3DGS by more than 6× when evaluated on large-scale scenes, while achieving state-of-the-art rendering quality.

🚀 TODO & Roadmap

  • βœ”οΈ Release evaluation code πŸŽ‰
  • πŸ”² Release pre-trained models on Mill19, UrbanScene3D, and MatrixCity
  • βœ”οΈ Release web-viewer
  • βœ”οΈ Release training code
    • βœ”οΈ Gaussian Splatting trainer πŸŽ‰
    • βœ”οΈ Scaffold-GS trainer πŸŽ‰
    • βœ”οΈ Support Taming-3DGS πŸŽ‰
    • πŸ”² ADMM Gaussian Splatting trainer
  • πŸ”² Test on street-view scenes
  • πŸ”² Support distributed training of Scaffold-GS and Octree-GS

📋 Train & Test

βš—οΈ Preprocess Mill-19 dataset and UrbanScene3D dataset

Follow the instructions for Mill 19 and UrbanScene 3D in Mega-NeRF to download the Mill-19 and UrbanScene3D datasets. We provide a script to convert the Mega-NeRF camera poses to the COLMAP format.

cd DOGS
# Replace `data_dir` at Line 202 and Line 205 with your own paths.
python -m scripts.preprocess.meganerf_to_colmap
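
For reference, a converted scene typically ends up in the standard COLMAP layout sketched below. The exact file names and locations depend on the conversion script, so treat this as an assumption rather than its guaranteed output:

# <scene>/                  (placeholder for your scene directory)
# ├── images/               extracted RGB images
# └── sparse/0/
#     ├── cameras.bin       camera intrinsics
#     ├── images.bin        camera poses (extrinsics)
#     └── points3D.bin      sparse point cloud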

βš—οΈ Preprocess MatrixCity dataset

We also provide a script to convert the camera poses of the MatrixCity dataset into the COLMAP format:

cd DOGS
# Replace 'data_dir_list' at Line 31 with your own paths;
# also set the scenes you want to convert at Lines 23-27.
python -m scripts.preprocess.matrix_city_to_colmap
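
If COLMAP is installed, you can sanity-check a converted model with COLMAP's generic model_analyzer command (a standard COLMAP tool, not part of this repository); the path below is a placeholder for your converted sparse model:

colmap model_analyzer --path /path/to/MatrixCity/<scene>/sparse/0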

βš—οΈ Preprocess Large-Scale dataset

We first run the provided script to split a large-scale scene into several blocks:

cd scripts/preprocess
./preprocess_large_scale_data.sh 0 urban3d gaussian_splatting
Visualize scene splitting

Please check out and compile my modified version of COLMAP. After installation, launch COLMAP's GUI. I extended COLMAP's original model files with an additional cluster.txt file, where each line follows the format [image_id, cluster_id]. Once COLMAP's GUI finds this file, it renders each image with a color corresponding to its cluster ID. Below are some examples of scene splitting:

Example splits: sci-art_blocks_2x4_cameras, campus_blocks_2x4_cameras
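
For reference, a hypothetical cluster.txt for a scene split into two blocks could look like the sketch below, with one [image_id, cluster_id] pair per line (the exact delimiter is defined by the modified COLMAP, so check its model reader if unsure):

# cluster.txt (illustrative values only)
1 0
2 0
3 1
4 1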

βš—οΈ Preprocess your own dataset

Additionally, we provide scripts to preprocess your own dataset: they take a .MOV video as input and output camera poses in the COLMAP format:

VIDEO_DIR=x
INPUT_FILE=xx
OUTPUT_FOLDER=xxx
FRAMERATE=3
VOC_TREE_PATH=xxxx
cd scripts/preprocess
# (1) Convert video to image sequence
./video_to_sequence.sh $VIDEO_DIR $INPUT_FILE $OUTPUT_FOLDER $FRAMERATE
# (2) Compute camera poses with COLMAP
./colmap_mapping.sh $VIDEO_DIR $VIDEO_DIR $VOC_TREE_PATH 100 0
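
Conceptually, step (1) corresponds to a standard ffmpeg frame extraction such as the sketch below; the actual options used by video_to_sequence.sh may differ:

# Extract frames at $FRAMERATE fps from the input video (illustrative; not necessarily what the script runs).
ffmpeg -i "$VIDEO_DIR/$INPUT_FILE" -vf "fps=$FRAMERATE" "$OUTPUT_FOLDER/frame_%05d.jpg"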

⌛ Train 3D Gaussian Splatting

Train 3DGS on a single GPU

cd scripts/train
DATASET=mipnerf360
./train_nvs.sh 0 $EXP_SUFFIX $DATASET gaussian_splatting

We provide configuration files for training on the blender, llff, matrix_city, mipnerf360, tanks_and_temples, and urban3d datasets. You can also train on your own dataset by setting the correct dataset path and scenes in config/gaussian_splatting/custom.yaml.
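
For example, training on your own dataset would follow the same pattern; DATASET=custom is assumed here to match the custom.yaml config file name:

cd scripts/train
DATASET=custom   # assumed to select config/gaussian_splatting/custom.yaml
./train_nvs.sh 0 $EXP_SUFFIX $DATASET gaussian_splatting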

Train 3DGS on multiple GPUs

The distributed training code is still being refactored and tested. You can try the admm branch for a quick test: git checkout admm. Note that the current code and CUDA rasterizer differ from those used for the camera-ready experiments, so we suggest evaluating performance once our pre-trained models are released.

Here we provide scripts and an example showing how to run DOGS on three compute nodes with 9 GPUs in total (1 GPU on the master node and 4 GPUs on each of the two slave nodes).

Before running the program, we may need to modify the parameters in the provided scripts: (1) scripts/train/train_admm_master.sh:

  • set NUM_TOTAL_NODES to the correct total number of GPUs (9 in this example, as described above)
  • set ETHERNET_INTERFACE to the network interface of your machine (you can find the correct interface by running ifconfig in a terminal on a Linux machine)
  • set DATASET to the dataset you want to reconstruct
  • set the correct IP address of the master node --master_addr=xx.xx.xx.xx

(2) Modify the above-mentioned parameters accordingly in scripts/train/train_admm_worker1.sh and scripts/train/train_admm_worker2.sh.
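
For orientation, the launch performed by these scripts roughly corresponds to a standard torch.distributed/torchrun setup such as the sketch below: with 1 GPU on the master and 4 on each slave, the world size is 1 + 4 + 4 = 9. The environment variables, flag values, and entry script are illustrative assumptions, not the exact contents of the provided scripts:

# Bind collective-communication traffic to the chosen network interface.
export NCCL_SOCKET_IFNAME=$ETHERNET_INTERFACE
export GLOO_SOCKET_IFNAME=$ETHERNET_INTERFACE
# Master node: node_rank=0 with 1 GPU; the two workers would use node_rank=1/2 and --nproc_per_node=4.
torchrun --nnodes=3 --node_rank=0 --nproc_per_node=1 \
    --master_addr=xx.xx.xx.xx --master_port=29500 \
    <training_script>.py   # placeholder for the repository's actual entry point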

First, in a terminal on the master node, we run:

cd scripts/train
./train_admm_master.sh $EXP_SUFFIX urban3d_admm

Then, we start a worker in a terminal on each of the two slave nodes:

# On slave node #1
cd scripts/train
./train_admm_worker1.sh $EXP_SUFFIX urban3d_admm

# On slave node #2
cd scripts/train
./train_admm_worker2.sh $EXP_SUFFIX urban3d_admm

After that, we can have a cup of coffee and wait for the master node to connect to the slave nodes and finish the training.

📊 Evaluate 3D Gaussian Splatting

cd scripts/eval
./eval_nvs.sh 0 $EXP_SUFFIX urban3d gaussian_splatting


✏️ Cite

If you find this project useful for your research, please consider citing our paper:

@inproceedings{yuchen2024dogaussian,
    title={{DOGS}: Distributed-Oriented Gaussian Splatting for Large-Scale 3D Reconstruction Via Gaussian Consensus},
    author={Yu Chen and Gim Hee Lee},
    booktitle={Advances in Neural Information Processing Systems (NeurIPS)},
    year={2024},
}

🙌 Acknowledgements

This work is built upon 3d-gaussian-splatting. We sincerely thank the authors for releasing their code. Yu Chen was partially supported by a Google PhD Fellowship while finishing this project.

🪪 License

Copyright © 2024, Chen Yu. All rights reserved. Please see the license file for terms.
