We propose a novel framework to remove transient objects from input videos for 3D scene reconstruction using Gaussian Splatting.
Our framework consists of the following steps. First, we propose an unsupervised training strategy for a classification network that distinguishes transient objects from static scene parts based on their distinct optimization behavior during 3D Gaussian Splatting reconstruction.
Then, we improve the boundary quality and stability of the detected transients by combining the detections from the first step with an off-the-shelf segmentation method. We also propose a simple and effective strategy for tracking objects through the input video both forward and backward in time.
@misc{pryadilshchikov2024t3dgsremovingtransientobjects,
title={T-3DGS: Removing Transient Objects for 3D Scene Reconstruction},
author={Vadim Pryadilshchikov and Alexander Markin and Artem Komarichev and Ruslan Rakhimov and Peter Wonka and Evgeny Burnaev},
year={2024},
eprint={2412.00155},
archivePrefix={arXiv},
primaryClass={cs.CV},
url={https://arxiv.org/abs/2412.00155},
}