SplattingAvatar: Realistic Real-Time Human Avatars with Mesh-Embedded Gaussian Splatting
A hybrid 3D representation of realistic human avatars that embeds Gaussian Splatting on a triangle mesh for efficient rendering across devices.
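As a concrete illustration of the embedding, here is a minimal sketch, assuming each Gaussian stores a triangle index, barycentric coordinates, and a scalar offset along the interpolated surface normal; the function name and data layout are illustrative, not the authors' implementation. When the mesh deforms (e.g., a posed FLAME or SMPL output), the Gaussian's center is recomputed from the same embedding, so it rides with the surface.

```python
# Minimal sketch of mesh-embedded Gaussian placement (illustrative, not the
# paper's code): each Gaussian is pinned to a triangle by barycentric
# coordinates plus a displacement along the interpolated normal.
import numpy as np

def embedded_position(vertices, faces, normals, tri_idx, uv, d):
    """Place a Gaussian on a (possibly deformed) mesh.

    vertices: (V, 3) deformed mesh vertices (e.g., posed FLAME/SMPL output)
    faces:    (F, 3) triangle vertex indices
    normals:  (V, 3) per-vertex normals of the deformed mesh
    tri_idx:  index of the triangle the Gaussian is embedded in
    uv:       barycentric coordinates (u, v); the third is w = 1 - u - v
    d:        signed offset along the interpolated surface normal
    """
    i, j, k = faces[tri_idx]
    u, v = uv
    w = 1.0 - u - v
    # Barycentric interpolation of position and normal over the triangle.
    p = w * vertices[i] + u * vertices[j] + v * vertices[k]
    n = w * normals[i] + u * normals[j] + v * normals[k]
    n = n / np.linalg.norm(n)
    # The Gaussian center rides with the mesh: surface point plus normal offset.
    return p + d * n
```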
The paper presents SplattingAvatar, a novel method for realistic real-time human avatar rendering using mesh-embedded Gaussian Splatting. The method is evaluated on head avatars and full-body avatars using datasets from several sources. For head avatars, it takes as input images, masks, camera parameters, and tracked FLAME meshes; it is also trained and tested on meshes generated by NHA, which carry finer geometric detail. The method outperforms existing approaches in average photometric error, and qualitative results show that it generalizes to novel poses.

For full-body avatars, the method is compared against InstantAvatar and Anim-NeRF on the PeopleSnapshot dataset, where it achieves significant improvements in pixel-wise quality with low photometric errors. It also handles thin structures well, such as accessories on the wrist.
In terms of implementation, training follows the original 3D Gaussian Splatting implementation with specific parameter settings. Driving the Gaussians through their mesh embeddings yields smooth motion and rendering results. An ablation study on the walking-on-triangles mechanism shows that disabling it causes Gaussians to stick and pile up on triangle boundaries, producing artifacts when the avatar is animated with novel poses (the mechanism is sketched below). Additionally, the paper assembles a head-avatar dataset of 10 subjects drawn from publicly available datasets and demonstrates high-fidelity rendering, especially of details in the eyes, hair, and glasses.
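The ablation motivates why Gaussians must be able to cross triangle boundaries during optimization. The sketch below shows one plausible version of such a walking step, under illustrative assumptions: the helpers `edge_neighbors` (face adjacency across the edge opposite each local vertex) and `barycentric_of` are hypothetical, not the paper's code. When an update pushes the barycentric coordinates outside the current triangle, the Gaussian is re-embedded in the adjacent face rather than being clamped at the edge.

```python
# Hedged sketch of a "walking on triangles" step (illustrative, not the
# paper's code): re-embed a Gaussian in a neighbouring face when its
# barycentric coordinates leave the current triangle.
import numpy as np

def barycentric_of(point, tri):
    """Barycentric coordinates (w, u, v) of `point` w.r.t. triangle `tri` (3x3)."""
    a, b, c = tri
    v0, v1, v2 = b - a, c - a, point - a
    d00, d01, d11 = v0 @ v0, v0 @ v1, v1 @ v1
    d20, d21 = v2 @ v0, v2 @ v1
    denom = d00 * d11 - d01 * d01
    u = (d11 * d20 - d01 * d21) / denom
    v = (d00 * d21 - d01 * d20) / denom
    return np.array([1.0 - u - v, u, v])

def walk_step(vertices, faces, edge_neighbors, tri_idx, uv_new):
    """Keep a Gaussian on the surface after an optimization update.

    edge_neighbors[f][e]: face adjacent to face f across the edge opposite
    local vertex e (or -1 on a mesh boundary). Returns the (possibly new)
    face index and valid barycentric coordinates (u, v).
    """
    u, v = uv_new
    bary = np.array([1.0 - u - v, u, v])
    if (bary >= 0.0).all():            # still inside: nothing to do
        return tri_idx, uv_new
    # The most negative coordinate identifies the crossed edge.
    crossed = int(np.argmin(bary))
    nbr = edge_neighbors[tri_idx][crossed]
    if nbr < 0:                        # mesh boundary: clamp instead of walking
        bary = np.clip(bary, 0.0, None)
        bary = bary / bary.sum()
        return tri_idx, (bary[1], bary[2])
    # Re-express the extrapolated 3D point in the neighbouring triangle.
    p = bary @ vertices[faces[tri_idx]]
    w2, u2, v2 = barycentric_of(p, vertices[faces[nbr]])
    return nbr, (u2, v2)
```

Under this scheme, the triangle index is a discrete state updated by walking, while the barycentric coordinates and normal offset remain continuous trainable parameters.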