Face mediated human–robot interaction for remote medical examination

Cited by: 0
Authors
Thilina D. Lalitharatne
Leone Costi
Ryman Hashem
Ilana Nisky
Rachael E. Jack
Thrishantha Nanayakkara
Fumiya Iida
Affiliations
[1] University of Cambridge,Department of Engineering
[2] Imperial College London,Dyson School of Design Engineering
[3] Ben-Gurion University of the Negev,Department of Biomedical Engineering
[4] University of Glasgow,School of Psychology and Neuroscience
DOI: not available
Abstract
Real-time visual feedback on the consequences of actions is useful for future safety-critical human–robot interaction applications such as the remote physical examination of patients. Among the multiple formats available for presenting visual feedback, using the face as a feedback channel to mediate human–robot interaction in remote examination remains understudied. Here we describe a face-mediated human–robot interaction approach for remote palpation. It builds upon a robodoctor–robopatient platform in which a user palpates the robopatient to remotely control the robodoctor and diagnose a patient. A tactile sensor array mounted on the end effector of the robodoctor measures the haptic response of the patient under diagnosis and transfers it to the robopatient, which renders pain facial expressions in response to the palpation forces. We compare this approach against the direct presentation of tactile sensor data as a visual tactile map. As feedback, the former has the advantage of recruiting advanced human capabilities for decoding expressions on a human face, whereas the latter can present details such as the intensity and spatial distribution of palpation forces. In a user study, we compare these two approaches in a teleoperated palpation task whose goal is to find a hard nodule embedded in a remote abdominal phantom. We show that the face-mediated human–robot interaction approach leads to statistically significant improvements in localizing the hard nodule without compromising the nodule position estimation time. We highlight the inherent power of facial expressions as communicative signals to enhance the utility and effectiveness of human–robot interaction in remote medical examinations.
Related Papers (50 total)
  • [21] Real-time Face Tracking for Human-Robot Interaction
    Putro, Muhamad Dwisnanto
    Jo, Kang-Hyun
    2018 INTERNATIONAL CONFERENCE ON INFORMATION AND COMMUNICATION TECHNOLOGY ROBOTICS (ICT-ROBOT), 2018,
  • [22] Real-Time Face Recognition for Human-Robot Interaction
    Cruz, Claudia
    Enrique Sucar, L.
    Morales, Eduardo F.
    2008 8TH IEEE INTERNATIONAL CONFERENCE ON AUTOMATIC FACE & GESTURE RECOGNITION (FG 2008), VOLS 1 AND 2, 2008, : 679 - 684
  • [23] An Efficient ORB based Face Recognition framework for Human Robot Interaction
    Vinay, A.
    Cholin, Ajaykumar S.
    Bhat, Aditya D.
    Murthy, K. N. Balasubramanya
    Natarajan, S.
    INTERNATIONAL CONFERENCE ON ROBOTICS AND SMART MANUFACTURING (ROSMA2018), 2018, 133 : 913 - 923
  • [24] Enhancing Human-Robot Interaction by a Robot Face with Facial Expressions and Synchronized Lip Movements
    Seib, Viktor
    Giesen, Julian
    Gruentjens, Dominik
    Paulus, Dietrich
    WSCG 2013, COMMUNICATION PAPERS PROCEEDINGS, 2013, : 70 - 77
  • [25] Formal Verification for Human-Robot Interaction in Medical Environments
    Choi, Benjamin J.
    Park, Juyoun
    Park, Chung Hyuk
    HRI '21: COMPANION OF THE 2021 ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION, 2021, : 181 - 185
  • [26] New methods of human-robot interaction in medical practice
    Garcia-Aracil, Nicolas
    Zollo, Loredana
    Casals, Alicia
    Sabater-Navarro, JoseMaria
    COMPUTER METHODS AND PROGRAMS IN BIOMEDICINE, 2014, 116 (02) : 49 - 51
  • [27] On multi-human multi-robot remote interaction: a study of transparency, inter-human communication, and information loss in remote interaction
    Patel, Jayam
    Sonar, Prajankya
    Pinciroli, Carlo
    SWARM INTELLIGENCE, 2022, 16 (02) : 107 - 142
  • [29] Remote interaction in robot supported playing
    Mina, S.
    Kronreif, G.
    Prazak, B.
    WMSCI 2005: 9th World Multi-Conference on Systemics, Cybernetics and Informatics, Vol 2, 2005, : 359 - 362
  • [30] A Comparative Human-Robot Interaction Study between Face-Display and an Advanced Social Robot
    Salem, Ahmed
    Sumi, Kaoru
2024 IEEE 48TH ANNUAL COMPUTERS, SOFTWARE, AND APPLICATIONS CONFERENCE (COMPSAC 2024), 2024, : 628 - 633