Ruijie Zhu*,
Yanzhe Liang*,
Hanzhi Chang,
Jiacheng Deng,
Jiahao Lu,
Wenfei Yang,
Tianzhu Zhang,
Yongdong Zhang
*Equal Contribution.
University of Science and Technology of China
NeurIPS 2024
[Announcement] We have submitted a request to the Program Chairs (PCs) of NeurIPS 2024 and ICLR 2025 to investigate a case related to this paper. To preserve our evidence, the relevant code and results will only be sent to the investigation team. Once the investigation results are announced, we will open-source the code for public use.
The overall architecture of MotionGS. It can be viewed as two data streams: (1) the 2D data stream uses the optical flow decoupling module to extract the motion flow as a 2D motion prior; (2) the 3D data stream deforms and transforms the Gaussians to render the image of the next frame. During training, we alternately optimize 3DGS and the camera poses through the camera pose refinement module.
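As a rough illustration of the two-stream design and the alternating optimization, here is a minimal PyTorch sketch. It is not the released implementation: the toy `render` and `motion_flow` functions, the network sizes, the loss weights, and all tensor shapes are placeholder assumptions standing in for the differentiable 3DGS rasterizer, the optical flow decoupling module, and the camera pose refinement module described above.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy stand-ins: all names, shapes, and losses below are illustrative placeholders,
# not the MotionGS codebase or its API.
N = 1024                                               # number of Gaussians
gaussian_xyz = nn.Parameter(torch.randn(N, 3) * 0.1)   # canonical Gaussian centers
camera_pose = nn.Parameter(torch.zeros(6))             # per-frame pose correction

deform_net = nn.Sequential(                            # predicts per-Gaussian offsets from (xyz, t)
    nn.Linear(4, 64), nn.ReLU(), nn.Linear(64, 3)
)

def render(xyz, pose):
    """Placeholder for the differentiable 3DGS rasterizer."""
    return (xyz + pose[:3]).tanh()

def motion_flow(offsets, pose):
    """Placeholder projection of Gaussian motion into 2D; the real motion flow
    is produced by the optical flow decoupling module."""
    return (offsets + pose[:3])[:, :2]

target = torch.rand(N, 3)       # pretend ground-truth rendering of the next frame
flow_prior = torch.rand(N, 2)   # pretend 2D motion prior (decoupled motion flow)

opt_gauss = torch.optim.Adam([gaussian_xyz, *deform_net.parameters()], lr=1e-3)
opt_pose = torch.optim.Adam([camera_pose], lr=1e-4)

for step in range(200):
    t = torch.full((N, 1), step / 200.0)
    # 3D stream: deform the canonical Gaussians to the current timestamp.
    offsets = deform_net(torch.cat([gaussian_xyz.detach(), t], dim=-1))

    rendered = render(gaussian_xyz + offsets, camera_pose)
    loss = (rendered - target).abs().mean() \
         + 0.1 * (motion_flow(offsets, camera_pose) - flow_prior).abs().mean()

    # Alternate optimization: even steps update the Gaussians and the deformation
    # network, odd steps refine the camera pose.
    opt_gauss.zero_grad()
    opt_pose.zero_grad()
    loss.backward()
    (opt_gauss if step % 2 == 0 else opt_pose).step()
```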
- Release the video demo, watch it on YouTube 🚀
- Release the MotionGS code
If you find our work useful in your research, please consider citing:
@article{zhu2024motiongs,
  title={MotionGS: Exploring Explicit Motion Guidance for Deformable 3D Gaussian Splatting},
  author={Zhu, Ruijie and Liang, Yanzhe and Chang, Hanzhi and Deng, Jiacheng and Lu, Jiahao and Yang, Wenfei and Zhang, Tianzhu and Zhang, Yongdong},
  journal={arXiv preprint arXiv:2410.07707},
  year={2024}
}