4D Gaussian Splatting for Real-Time Dynamic Scene Rendering
Efficiently represents dynamic scenes by capturing Gaussian motion and shape changes across time using a Gaussian deformation field network.
Introduces 4D Gaussian Splatting (4D-GS), a novel approach for efficiently representing dynamic scenes. The method captures both Gaussian motion and shape changes across time with an efficient Gaussian deformation field network, which consists of a spatial-temporal structure encoder and a multi-head Gaussian deformation decoder that transforms a set of canonical 3D Gaussians into new positions and shapes at each timestamp. Unlike previous methods that model the motion of each Gaussian separately, the spatial-temporal structure encoder in 4D-GS connects adjacent 3D Gaussians to predict more accurate motion and shape deformation. The deformed 3D Gaussians are then directly splatted to render the image at the corresponding timestamp. The framework achieves real-time rendering of dynamic scenes and shows potential for editing and tracking in 4D scenes.
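To make the deformation step concrete, below is a minimal PyTorch sketch of a multi-head Gaussian deformation decoder and the canonical-to-deformed mapping described above. The module names, feature sizes, and the additive way offsets are applied are illustrative assumptions, not the authors' exact implementation; `encoder` stands in for any spatial-temporal feature extractor.

```python
# Minimal sketch (assumptions: feature sizes, additive offsets, module names).
import torch
import torch.nn as nn

class MultiHeadDeformationDecoder(nn.Module):
    """Predicts per-Gaussian offsets for position, rotation, and scale."""
    def __init__(self, feat_dim: int = 64, hidden: int = 128):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Linear(feat_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        # Separate heads, one per deformed Gaussian attribute.
        self.pos_head = nn.Linear(hidden, 3)    # position offset
        self.rot_head = nn.Linear(hidden, 4)    # rotation (quaternion) offset
        self.scale_head = nn.Linear(hidden, 3)  # scale offset

    def forward(self, feat: torch.Tensor):
        h = self.backbone(feat)
        return self.pos_head(h), self.rot_head(h), self.scale_head(h)

def deform_gaussians(encoder, decoder, xyz, rot, scale, t):
    """Map canonical Gaussians (xyz, rot, scale) to their state at time t.

    `encoder` is a hypothetical spatial-temporal feature extractor taking
    (N, 3) positions and a scalar time, returning (N, feat_dim) features.
    The deformed Gaussians would then be passed to the splatting renderer.
    """
    feat = encoder(xyz, t)
    d_xyz, d_rot, d_scale = decoder(feat)
    return xyz + d_xyz, rot + d_rot, scale + d_scale
```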
The proposed 4D-GS method makes several key contributions. First, it introduces an efficient 4D Gaussian splatting framework with a Gaussian deformation field that models both Gaussian motion and shape changes across time. Second, a multi-resolution encoding method is proposed to connect nearby 3D Gaussians and build rich Gaussian features via an efficient spatial-temporal structure encoder. The method achieves real-time rendering of dynamic scenes at high resolutions on both synthetic and real datasets, while matching or exceeding the quality of previous state-of-the-art methods.
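The sketch below illustrates one way such a multi-resolution spatial-temporal encoder could look: a set of 2D feature planes over the (x, y, z, t) axes at several resolutions, queried by bilinear interpolation, so that nearby Gaussians sample nearby grid cells and thus share features. The plane layout, resolutions, feature dimensions, and elementwise-product fusion are assumptions for illustration (in the spirit of plane-factorized encodings), not the paper's exact architecture; its output could feed a decoder like the one sketched earlier, provided the feature dimensions match.

```python
# Minimal sketch of a multi-resolution spatial-temporal encoder (assumptions:
# Gaussian centers normalized to [-1, 1], time in [0, 1], plane-based layout).
import torch
import torch.nn as nn
import torch.nn.functional as F

class SpatialTemporalEncoder(nn.Module):
    """Six 2D feature planes per resolution level: (xy, xz, yz, xt, yt, zt)."""
    PLANES = [(0, 1), (0, 2), (1, 2), (0, 3), (1, 3), (2, 3)]

    def __init__(self, feat_dim: int = 16, resolutions=(32, 64)):
        super().__init__()
        # One learnable (feat_dim, r, r) grid per plane per resolution level.
        self.grids = nn.ParameterList([
            nn.Parameter(0.1 * torch.randn(1, feat_dim, r, r))
            for r in resolutions for _ in self.PLANES
        ])

    def forward(self, xyz: torch.Tensor, t: float) -> torch.Tensor:
        # Build 4D coordinates (x, y, z, t) in [-1, 1].
        tt = torch.full_like(xyz[:, :1], 2.0 * t - 1.0)
        coords = torch.cat([xyz, tt], dim=-1)                 # (N, 4)
        feats = []
        n_levels = len(self.grids) // len(self.PLANES)
        for level in range(n_levels):
            level_feat = 1.0
            for i, (a, b) in enumerate(self.PLANES):
                grid = self.grids[level * len(self.PLANES) + i]
                pts = coords[:, [a, b]].view(1, -1, 1, 2)     # query points
                sampled = F.grid_sample(grid, pts, align_corners=True)
                # Fuse planes by elementwise product, as in plane-factorized fields.
                level_feat = level_feat * sampled.view(grid.shape[1], -1).T
            feats.append(level_feat)                          # (N, feat_dim)
        return torch.cat(feats, dim=-1)                       # (N, feat_dim * levels)
```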