TL;DR: We propose MoBGS, a deblurring dynamic 3D Gaussian Splatting (3DGS) framework that reconstructs sharp, high-quality novel spatio-temporal views from blurry monocular videos in an end-to-end manner. MoBGS introduces a novel Blur-adaptive Latent Camera Estimation (BLCE) method for effective latent camera trajectory estimation, improving global camera-motion deblurring. In addition, we propose a physically inspired Latent Camera-induced Exposure Estimation (LCEE) method to ensure consistent deblurring of both global camera motion and local object motion.
Overview of MoBGS. MoBGS estimates latent camera poses for each blurry frame using our Blur-adaptive Latent Camera Estimation (BLCE) method. Leveraging these latent camera poses, it then estimates the corresponding exposure time via our Latent Camera-induced Exposure Estimation (LCEE) method, ensuring physically consistent blur modeling of local moving objects.
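The physical blur model implied by this pipeline can be sketched as follows: a blurry observation is the average of sharp renders along the latent camera trajectory over the exposure window. This is a minimal illustrative sketch only, not the authors' implementation; the linear pose interpolation, the scalar "renderer", and all function names here are simplifying assumptions (a real pipeline would rasterize dynamic 3D Gaussians at each latent pose).

```python
import numpy as np

def interpolate_latent_poses(pose_start, pose_end, n_samples):
    """Linearly interpolate latent camera poses across the exposure window.
    Poses are simplified to 3-vector translations; a hypothetical stand-in
    for BLCE's estimated latent camera trajectory."""
    ts = np.linspace(0.0, 1.0, n_samples)
    return [(1.0 - t) * pose_start + t * pose_end for t in ts]

def render_sharp(pose, scene_point=np.array([0.0, 0.0, 5.0])):
    """Toy 'renderer': a single intensity value that depends on camera pose.
    Stands in for rendering the dynamic 3DGS scene from one latent pose."""
    depth = np.linalg.norm(scene_point - pose)
    return float(np.exp(-0.1 * depth))

def synthesize_blur(pose_start, pose_end, n_samples=8):
    """Physical blur model: average the sharp renders produced along the
    latent camera trajectory sampled within the exposure time."""
    poses = interpolate_latent_poses(pose_start, pose_end, n_samples)
    return float(np.mean([render_sharp(p) for p in poses]))

blurry = synthesize_blur(np.zeros(3), np.array([0.5, 0.0, 0.0]))
```

In training, such a synthesized blurry value would be compared against the observed blurry frame, so that supervision on blurry inputs still yields sharp per-pose renders.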
@article{bui2025mobgs,
title={MoBGS: Motion Deblurring Dynamic 3D Gaussian Splatting for Blurry Monocular Video},
author={Bui, Minh-Quan Viet and Park, Jongmin and Bello, Juan Luis Gonzalez and Moon, Jaeho and Oh, Jihyong and Kim, Munchurl},
journal={arXiv preprint arXiv:2504.15122},
year={2025}}