T-3DGS: Removing Transient Objects for 3D Scene Reconstruction

Vadim Pryadilshchikov1, Alexander Markin1, Artem Komarichev1, Ruslan Rakhimov2, Peter Wonka3, Evgeny Burnaev1,4
1Skoltech, Russia; 2Robotics Center, Russia; 3KAUST, Saudi Arabia; 4AIRI, Russia

Abstract

We propose a novel framework to remove transient objects from input videos for 3D scene reconstruction using Gaussian Splatting.

Our framework consists of the following steps. First, we propose an unsupervised training strategy for a classification network that distinguishes transient objects from static scene parts based on their different training behavior during the 3D Gaussian Splatting reconstruction.
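To make the idea concrete, below is a minimal PyTorch sketch, not the paper's implementation, of how per-pixel training behavior could drive an unsupervised transient/static classifier: pixels whose photometric residuals stay large and unstable across reconstruction iterations are scored as transient and down-weighted in the loss. All names (ResidualStats, TransientClassifier) and constants (the 0.01 sparsity weight) are illustrative assumptions; in practice the classifier and the Gaussian scene would be optimized jointly.

# Illustrative sketch only: score pixels as transient from how their
# rendering residuals evolve during reconstruction training.
import torch
import torch.nn as nn

class ResidualStats:
    """Running mean/variance of per-pixel photometric residuals across iterations."""
    def __init__(self, momentum: float = 0.9):
        self.momentum = momentum
        self.mean = None
        self.var = None

    def update(self, residual: torch.Tensor) -> torch.Tensor:
        # residual: (H, W) absolute photometric error of the current render
        if self.mean is None:
            self.mean = residual.clone()
            self.var = torch.zeros_like(residual)
        else:
            delta = residual - self.mean
            self.mean = self.momentum * self.mean + (1 - self.momentum) * residual
            self.var = self.momentum * self.var + (1 - self.momentum) * delta ** 2
        # per-pixel feature vector: (H, W, 2)
        return torch.stack([self.mean, self.var], dim=-1)

class TransientClassifier(nn.Module):
    """Tiny MLP mapping per-pixel residual statistics to a transient probability."""
    def __init__(self, in_dim: int = 2, hidden: int = 16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 1), nn.Sigmoid(),
        )

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        return self.net(feats).squeeze(-1)  # (H, W) transient probability

def masked_photometric_loss(render, target, p_transient):
    """Down-weight pixels believed to be transient; a sparsity prior penalizes
    the trivial solution of declaring everything transient."""
    per_pixel = (render - target).abs().mean(dim=-1)   # (H, W)
    recon = ((1.0 - p_transient) * per_pixel).mean()
    sparsity = p_transient.mean()
    return recon + 0.01 * sparsity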

Then, we improve the boundary quality and stability of the detected transients by combining our results from the first step with an off-the-shelf segmentation method. We also propose a simple and effective strategy to track objects in the input video forward and backward in time.
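As a rough illustration of the forward/backward tracking idea, the sketch below propagates per-frame transient masks through a video. It uses dense optical flow (OpenCV Farneback) purely as a stand-in for the propagation module and a simple per-frame union as the merging rule; frames, seed_masks, and warp_mask are hypothetical names, and this is not the pipeline described in the paper, which combines its detections with an off-the-shelf segmentation method.

# Illustrative sketch only: carry detected masks forward and backward in time.
import cv2
import numpy as np

def warp_mask(mask, src_gray, dst_gray):
    """Warp a binary mask from the source frame onto the destination frame
    by sampling along dense optical flow computed from dst to src."""
    flow = cv2.calcOpticalFlowFarneback(dst_gray, src_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    h, w = mask.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    map_x = (grid_x + flow[..., 0]).astype(np.float32)
    map_y = (grid_y + flow[..., 1]).astype(np.float32)
    return cv2.remap(mask, map_x, map_y, cv2.INTER_NEAREST)

def propagate_masks(frames, seed_masks):
    """frames: list of HxWx3 uint8 images; seed_masks: dict frame index -> HxW
    uint8 mask. Each seed is propagated forward and backward and merged per
    frame with a union."""
    grays = [cv2.cvtColor(f, cv2.COLOR_BGR2GRAY) for f in frames]
    h, w = grays[0].shape
    out = [np.zeros((h, w), dtype=np.uint8) for _ in frames]
    for t, seed in seed_masks.items():
        out[t] |= seed
        cur = seed
        for i in range(t + 1, len(frames)):        # forward in time
            cur = warp_mask(cur, grays[i - 1], grays[i])
            if cur.sum() == 0:                     # object left the view
                break
            out[i] |= cur
        cur = seed
        for i in range(t - 1, -1, -1):             # backward in time
            cur = warp_mask(cur, grays[i + 1], grays[i])
            if cur.sum() == 0:
                break
            out[i] |= cur
    return out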

Method overview (figure)

Visual Comparisons on the On-the-go Dataset

Side-by-side image comparisons between our method (Ours) and SpotLessSplats.

Robust Mask Propagation

Visual Comparisons of TMR

Side-by-side image comparisons of reconstructions without TMR and with TMR.

BibTeX


        @misc{pryadilshchikov2024t3dgsremovingtransientobjects,
              title={T-3DGS: Removing Transient Objects for 3D Scene Reconstruction}, 
              author={Vadim Pryadilshchikov and Alexander Markin and Artem Komarichev and Ruslan Rakhimov and Peter Wonka and Evgeny Burnaev},
              year={2024},
              eprint={2412.00155},
              archivePrefix={arXiv},
              primaryClass={cs.CV},
              url={https://arxiv.org/abs/2412.00155}, 
        }