Face mediated human–robot interaction for remote medical examination

Cited by: 0
Authors
Thilina D. Lalitharatne
Leone Costi
Ryman Hashem
Ilana Nisky
Rachael E. Jack
Thrishantha Nanayakkara
Fumiya Iida
Affiliations
[1] University of Cambridge, Department of Engineering
[2] Imperial College London, Dyson School of Design Engineering
[3] Ben-Gurion University of the Negev, Department of Biomedical Engineering
[4] University of Glasgow, School of Psychology and Neuroscience
DOI
Not available
Abstract
Real-time visual feedback on the consequences of actions is useful for future safety-critical human–robot interaction applications such as remote physical examination of patients. Among the many formats for presenting visual feedback, using the face as a feedback channel for mediating human–robot interaction in remote examination remains understudied. Here we describe a face mediated human–robot interaction approach for remote palpation. It builds upon a robodoctor–robopatient platform in which a user palpates the robopatient to remotely control the robodoctor and diagnose a patient. A tactile sensor array mounted on the end effector of the robodoctor measures the haptic response of the patient under diagnosis and transfers it to the robopatient, which renders pain facial expressions in response to the palpation forces. We compare this approach against a direct presentation of the tactile sensor data as a visual tactile map. As feedback, the former has the advantage of recruiting advanced human capabilities for decoding expressions on a human face, whereas the latter can present details such as the intensity and spatial distribution of palpation forces. In a user study, we compare the two approaches in a teleoperated palpation task in which participants locate a hard nodule embedded in a remote abdominal phantom. We show that the face mediated human–robot interaction approach leads to statistically significant improvements in localizing the hard nodule without compromising nodule position estimation time. We highlight the inherent power of facial expressions as communicative signals to enhance the utility and effectiveness of human–robot interaction in remote medical examinations.
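The abstract outlines a force-to-expression pipeline: palpation forces sensed by the tactile array on the robodoctor drive pain facial expressions rendered on the robopatient. As a rough, hypothetical illustration of that idea only (the paper does not specify this mapping), the Python sketch below converts a tactile array reading into a normalized pain-expression intensity; the array shape, the max_force threshold, and the blending weights are all assumptions, not the calibration used in the study.

import numpy as np

def pain_intensity(tactile_map, max_force=10.0):
    """Map a tactile sensor array reading (forces in newtons) to a
    pain-expression intensity in [0, 1]. Illustrative assumption only:
    a linear blend of overall pressure and peak local force."""
    total = float(np.sum(tactile_map))
    peak = float(np.max(tactile_map))
    # Blend overall pressure with the sharpest local contact so that a
    # concentrated press (e.g. over a hard nodule) raises the rendered
    # expression intensity more than diffuse contact of the same total force.
    score = 0.5 * (total / (max_force * tactile_map.size)) + 0.5 * (peak / max_force)
    return min(max(score, 0.0), 1.0)

# Example: a 4 x 4 taxel array with one stiff region under the probe
reading = np.zeros((4, 4))
reading[1, 2] = 6.5   # higher reaction force over the embedded nodule
reading += 0.8        # background contact force
print(f"pain expression intensity: {pain_intensity(reading):.2f}")

Blending total and peak force is one simple way to make a localized press over a stiff nodule register more strongly than diffuse contact; any real calibration would have to match the pain-expression model actually driving the robopatient face.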