BodyFusion: Real-time Capture of Human Motion and Surface Geometry Using a Single Depth Camera

Cited: 105
Authors
Yu, Tao [1 ,2 ]
Guo, Kaiwen [2 ]
Xu, Feng [2 ]
Dong, Yuan [2 ]
Su, Zhaoqi [2 ]
Zhao, Jianhui [1 ]
Li, Jianguo [3 ]
Dai, Qionghai [2 ]
Liu, Yebin [2 ]
Affiliations
[1] Beihang Univ, Beijing, Peoples R China
[2] Tsinghua Univ, Beijing, Peoples R China
[3] Intel Labs China, Beijing, Peoples R China
Keywords
TRACKING;
DOI
10.1109/ICCV.2017.104
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We propose BodyFusion, a novel real-time geometry fusion method that can track and reconstruct non-rigid surface motion of a human performance using a single consumer-grade depth camera. To reduce the ambiguities of the non-rigid deformation parameterization on the surface graph nodes, we take advantage of the internal articulated motion prior for human performance and contribute a skeleton-embedded surface fusion (SSF) method. The key feature of our method is that it jointly solves for both the skeleton and graph-node deformations based on information of the attachments between the skeleton and the graph nodes. The attachments are also updated frame by frame based on the fused surface geometry and the computed deformations. Overall, our method enables increasingly denoised, detailed, and complete surface reconstruction as well as the updating of the skeleton and attachments as the temporal depth frames are fused. Experimental results show that our method exhibits substantially improved non-rigid motion fusion performance and tracking robustness compared with previous state-of-the-art fusion methods. We also contribute a dataset for the quantitative evaluation of fusion-based dynamic scene reconstruction algorithms using a single depth camera.
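The abstract's key idea of attaching surface graph nodes to an embedded skeleton can be illustrated with a toy residual term. The sketch below (a minimal illustration, not the paper's actual energy; `skin_point`, `binding_residual`, and the 50/50 attachment weights are all hypothetical choices for this example) uses linear blend skinning to predict a node's motion from per-bone rigid transforms and measures how far the node's own non-rigid translation deviates from that skeleton-driven prediction:

```python
import numpy as np

def skin_point(p, bone_transforms, weights):
    """Linear blend skinning: deform point p by the attachment-weighted
    sum of per-bone rigid transforms (4x4 homogeneous matrices)."""
    p_h = np.append(p, 1.0)
    blended = sum(w * T for w, T in zip(weights, bone_transforms))
    return (blended @ p_h)[:3]

def binding_residual(node_pos, node_translation, bone_transforms, weights):
    """Toy skeleton-binding term: the graph node's non-rigid translation
    should agree with the motion predicted by skinning it to the bones."""
    predicted = skin_point(node_pos, bone_transforms, weights)
    actual = node_pos + node_translation
    return np.linalg.norm(actual - predicted)

# Two bones: one stays at identity, one translates by (1, 0, 0).
T0 = np.eye(4)
T1 = np.eye(4)
T1[0, 3] = 1.0

# A node attached 50/50 to both bones is predicted to move by (0.5, 0, 0),
# so a node translation of exactly (0.5, 0, 0) gives zero residual.
node = np.array([0.0, 0.0, 0.0])
r = binding_residual(node, np.array([0.5, 0.0, 0.0]), [T0, T1], [0.5, 0.5])
```

In the paper's joint solve, terms of this flavor couple the skeleton and graph-node unknowns, and the attachment weights themselves are re-estimated each frame from the fused geometry.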
Pages: 910 - 919
Page count: 10
Related Papers
50 items total
  • [41] Real-time control of 3D virtual human motion using a depth-sensing camera for agricultural machinery training
    Wang, Chengfeng
    Ma, Qin
    Zhu, Dehai
    Chen, Hong
    Yang, Zhoutuo
    MATHEMATICAL AND COMPUTER MODELLING, 2013, 58 (3-4) : 776 - 783
  • [42] Real-Time Human Detection and Tracking in Complex Environments Using Single RGBD Camera
    Liu, Jun
    Liu, Ye
    Cui, Ying
    Chen, Yan Qiu
    2013 20TH IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP 2013), 2013, : 3088 - 3092
  • [43] Real-Time Human Motion Capture Driven by a Wireless Sensor Network
    Chen, Peng-zhan
    Li, Jie
    Luo, Man
    Zhu, Nian-hua
    INTERNATIONAL JOURNAL OF COMPUTER GAMES TECHNOLOGY, 2015, 2015
  • [44] Real-time Human Action Recognition From Motion Capture Data
    Vantigodi, Suraj
    Babu, R. Venkatesh
    2013 FOURTH NATIONAL CONFERENCE ON COMPUTER VISION, PATTERN RECOGNITION, IMAGE PROCESSING AND GRAPHICS (NCVPRIPG), 2013,
  • [45] Real-time displacement monitoring using camera video records with camera motion correction
    Yi, Zhuoran
    Cao, Miao
    Kito, Yuya
    Sato, Gota
    Zhang, Xuan
    Xie, Liyu
    Xue, Songtao
    MEASUREMENT, 2024, 229
  • [46] Real-time human action recognition based on depth motion maps
    Chen, Chen
    Liu, Kui
    Kehtarnavaz, Nasser
    JOURNAL OF REAL-TIME IMAGE PROCESSING, 2016, 12 (01) : 155 - 163
  • [48] Real-time Detection of Wearable Camera Motion Using Optical Flow
    Younis, Ola
    Al-Nuaimy, Waleed
    Rowe, Fiona
    Alomari, Mohammad H.
    2018 IEEE CONGRESS ON EVOLUTIONARY COMPUTATION (CEC), 2018, : 238 - 243
  • [49] Real-time and markerless 3D human motion capture using multiple views
    Michoud, Brice
    Guillou, Erwan
    Bouakaz, Saieda
    HUMAN MOTION - UNDERSTANDING, MODELING, CAPTURE AND ANIMATION, PROCEEDINGS, 2007, 4814 : 88 - +
  • [50] Comparing Real-time Human Motion Capture System using Inertial Sensors with Microsoft Kinect
    Xiang, Chengkai
    Hsu, Hui-Huang
    Hwang, Wu-Yuin
    Ma, Jianhua
    2014 7TH INTERNATIONAL CONFERENCE ON UBI-MEDIA COMPUTING AND WORKSHOPS (UMEDIA), 2014, : 53 - 58