Capturing complex hand movements and object interactions using machine learning-powered stretchable smart textile gloves

Times cited: 32
Authors
Tashakori, Arvin [1 ,2 ]
Jiang, Zenan [2 ,3 ]
Servati, Amir [2 ]
Soltanian, Saeid [2 ]
Narayana, Harishkumar [2 ,3 ]
Le, Katherine [2 ,3 ]
Nakayama, Caroline [2 ]
Yang, Chieh-ling [4 ,5 ]
Wang, Z. Jane [1 ]
Eng, Janice J. [6 ,7 ]
Servati, Peyman [1 ,2 ]
Affiliations
[1] Univ British Columbia, Dept Elect & Comp Engn, Flexible Elect & Energy Lab FEEL, Vancouver, BC, Canada
[2] Texavie Technol Inc, Vancouver, BC, Canada
[3] Univ British Columbia, Dept Mat Engn, Vancouver, BC, Canada
[4] Chang Gung Univ, Grad Inst Behav Sci, Coll Med, Dept Occupat Therapy, Taoyuan, Taiwan
[5] Chang Gung Mem Hosp, Dept Phys Med & Rehabil, Chiayi, Taiwan
[6] Univ British Columbia, Fac Med, Dept Phys Therapy, Vancouver, BC, Canada
[7] Vancouver Coastal Hlth Res Inst, Ctr Hip Hlth & Mobil, Vancouver, BC, Canada
Funding
Natural Sciences and Engineering Research Council of Canada;
Keywords
TACTILE;
DOI
10.1038/s42256-023-00780-9
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Accurate real-time tracking of dexterous hand movements has numerous applications in human-computer interaction, the metaverse, robotics and tele-health. Capturing realistic hand movements is challenging because of the large number of articulations and degrees of freedom. Here we report accurate and dynamic tracking of articulated hand and finger movements using stretchable, washable smart gloves with embedded helical sensor yarns and inertial measurement units. The sensor yarns have a high dynamic range, responding to strains as low as 0.005% and as high as 155%, and show stability during extensive use and washing cycles. We use multi-stage machine learning to report average joint-angle estimation root mean square errors of 1.21 degrees and 1.45 degrees for intra- and inter-participant cross-validation, respectively, matching the accuracy of costly motion-capture cameras without occlusion or field-of-view limitations. We report a data augmentation technique that enhances robustness to noise and variations of sensors. We demonstrate accurate tracking of dexterous hand movements during object interactions, opening new avenues of applications, including accurate typing on a mock paper keyboard, recognition of complex dynamic and static gestures adapted from American Sign Language, and object identification.
Pages: 106-118
Page count: 22
Related articles
2 records
  • [1] Tashakori, Arvin; Jiang, Zenan; Servati, Amir; Soltanian, Saeid; Narayana, Harishkumar; Le, Katherine; Nakayama, Caroline; Yang, Chieh-ling; Wang, Z. Jane; Eng, Janice J.; Servati, Peyman. Capturing complex hand movements and object interactions using machine learning-powered stretchable smart textile gloves. Nature Machine Intelligence, 2024, 6: 106-118
  • [2] Jiang, Chunpeng; Xu, Wenqiang; Li, Yutong; Yu, Zhenjun; Wang, Longchun; Hu, Xiaotong; Xie, Zhengyi; Liu, Qingkun; Yang, Bin; Wang, Xiaolin; Du, Wenxin; Tang, Tutian; Zheng, Dongzhe; Yao, Siqiong; Lu, Cewu; Liu, Jingquan. Capturing forceful interaction with deformable objects using a deep learning-powered stretchable tactile array. Nature Communications, 2024, 15 (1)