Data-Driven Haptic Perception for Robot-Assisted Dressing

Cited by: 0
Authors
Kapusta, Ariel [1 ]
Yu, Wenhao [2 ]
Bhattacharjee, Tapomayukh [1 ]
Liu, C. Karen [2 ]
Turk, Greg [2 ]
Kemp, Charles C. [1 ]
Affiliations
[1] Georgia Inst Technol, Healthcare Robot Lab, Atlanta, GA 30332 USA
[2] Georgia Inst Technol, Sch Interact Comp, Atlanta, GA 30332 USA
Source
2016 25TH IEEE INTERNATIONAL SYMPOSIUM ON ROBOT AND HUMAN INTERACTIVE COMMUNICATION (RO-MAN), 2016
Funding
National Science Foundation (USA);
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
Dressing is an important activity of daily living (ADL) with which many people require assistance due to impairments. Robots have the potential to provide dressing assistance, but physical interactions between clothing and the human body can be complex and difficult to visually observe. We provide evidence that data-driven haptic perception can be used to infer relationships between clothing and the human body during robot-assisted dressing. We conducted a carefully controlled experiment with 12 human participants during which a robot pulled a hospital gown along the length of each person's forearm 30 times. This representative task resulted in one of the following three outcomes: the hand missed the opening to the sleeve; the hand or forearm became caught on the sleeve; or the full forearm successfully entered the sleeve. We found that hidden Markov models (HMMs) using only forces measured at the robot's end effector classified these outcomes with high accuracy. The HMMs' performance generalized well to participants (98.61% accuracy) and velocities (98.61% accuracy) outside of the training data. They also performed well when we limited the force applied by the robot (95.8% accuracy with a 2N threshold), and could predict the outcome early in the process. Despite the lightweight hospital gown, HMMs that used forces in the direction of gravity substantially outperformed those that did not. The best performing HMMs used forces in the direction of motion and the direction of gravity.
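The classification approach described in the abstract can be pictured as one HMM trained per outcome class (missed, caught, success), with a trial labeled by whichever model assigns the highest likelihood to its observed force sequence. The sketch below is a minimal illustration of that idea using hmmlearn's GaussianHMM; it is not the authors' code, and the two-axis force features (direction of motion and direction of gravity), state count, and helper names are assumptions for demonstration only.

```python
# Minimal sketch (assumed implementation, not the authors' code): one Gaussian
# HMM per dressing outcome, trained on end-effector force sequences, with the
# outcome chosen by maximum log-likelihood at classification time.
import numpy as np
from hmmlearn.hmm import GaussianHMM

OUTCOMES = ["missed", "caught", "success"]  # assumed labels matching the abstract

def train_outcome_models(sequences_by_outcome, n_states=5):
    """sequences_by_outcome: dict mapping outcome -> list of (T_i, 2) arrays of
    forces along the motion and gravity axes (the best-performing features
    reported in the abstract). n_states is illustrative."""
    models = {}
    for outcome, seqs in sequences_by_outcome.items():
        X = np.concatenate(seqs)            # stack all trials into one array
        lengths = [len(s) for s in seqs]    # per-trial lengths for hmmlearn
        hmm = GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=100)
        hmm.fit(X, lengths)
        models[outcome] = hmm
    return models

def classify(models, force_sequence):
    """Return the outcome whose HMM assigns the highest log-likelihood to the
    observed (possibly partial) force sequence, enabling early prediction."""
    return max(models, key=lambda o: models[o].score(force_sequence))
```

Because the likelihood can be evaluated on a partial sequence, the same scoring step supports predicting the outcome early in the dressing motion, as the abstract notes.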
Pages: 451 - 458
Number of pages: 8
Related Papers
50 records in total
  • [31] Robot-assisted Microsurgical Forceps with Haptic Feedback for Transoral Laser Microsurgery
    Deshpande, Nikhil
    Chauhan, Manish
    Pacchierotti, Claudio
    Prattichizzo, Domenico
    Caldwell, Darwin G.
    Mattos, Leonardo S.
    2016 38TH ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY (EMBC), 2016, : 5156 - 5159
  • [32] Robot-Assisted Haptic Rendering for Nail Hammering: A Representative of IADL Tasks
    Zhang, Changqi
    Wang, Cui
    Li, Ping
    Liu, Yudong
    Chen, Yi-Feng
    Zhang, Mingming
    IEEE TRANSACTIONS ON AUTOMATION SCIENCE AND ENGINEERING, 2024, 21 (03) : 4028 - 4041
  • [33] Personalized Robot-assisted Dressing using User Modeling in Latent Spaces
    Zhang, Fan
    Cully, Antoine
    Demiris, Yiannis
    2017 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2017, : 3603 - 3610
  • [34] Haptic Feedback and Dynamic Active Constraints for Robot-Assisted Endovascular Catheterization
    Dagnino, G.
    Liu, J.
    Abdelaziz, M. E. M. K.
    Chi, W.
    Riga, C.
    Yang, G-Z.
    2018 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2018, : 1770 - 1775
  • [35] Visual-Tactile Learning of Garment Unfolding for Robot-Assisted Dressing
    Zhang, Fan
    Demiris, Yiannis
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2023, 8 (09) : 5512 - 5519
  • [36] Personalized collaborative plans for robot-assisted dressing via optimization and simulation
    Ariel Kapusta
    Zackory Erickson
    Henry M. Clever
    Wenhao Yu
    C. Karen Liu
    Greg Turk
    Charles C. Kemp
    Autonomous Robots, 2019, 43 : 2183 - 2207
  • [37] Personalized collaborative plans for robot-assisted dressing via optimization and simulation
    Kapusta, Ariel
    Erickson, Zackory
    Clever, Henry M.
    Yu, Wenhao
    Liu, C. Karen
    Turk, Greg
    Kemp, Charles C.
    AUTONOMOUS ROBOTS, 2019, 43 (08) : 2183 - 2207
  • [38] Exploiting Symmetry in Human Robot-Assisted Dressing Using Reinforcement Learning
    Ildefonso, Pedro
    Remedios, Pedro
    Silva, Rui
    Vasco, Miguel
    Melo, Francisco S.
    Paiva, Ana
    Veloso, Manuela
    PROGRESS IN ARTIFICIAL INTELLIGENCE (EPIA 2021), 2021, 12981 : 405 - 417
  • [39] Automatic deformation transfer for data-driven haptic rendering
    Farag, Sara
    Abdelrahman, Wael
    Creighton, Douglas
    Nahavandi, Saeid
    2013 11TH IEEE INTERNATIONAL CONFERENCE ON INDUSTRIAL INFORMATICS (INDIN), 2013, : 264 - 269
  • [40] Computationally Efficient Techniques for Data-Driven Haptic Rendering
    Hoever, Raphael
    Di Luca, Massimiliano
    Szekely, Gabor
    Harders, Matthias
    WORLD HAPTICS 2009: THIRD JOINT EUROHAPTICS CONFERENCE AND SYMPOSIUM ON HAPTIC INTERFACES FOR VIRTUAL ENVIRONMENT AND TELEOPERATOR SYSTEMS, PROCEEDINGS, 2009, : 39 - +