When my robot smiles at me: Enabling human-robot rapport via real-time head gesture mimicry

Cited by: 0
|
Authors
Laurel D. Riek
Philip C. Paul
Peter Robinson
Affiliations
[1] University of Cambridge,Computer Laboratory
[2] University of Cambridge,Department of Engineering
Source
Journal on Multimodal User Interfaces | 2010 / Volume 3
Keywords
Affective computing; Empathy; Facial expressions; Human-robot interaction; Social robotics;
Abstract
People use imitation to encourage each other during conversation. We conducted an experiment to investigate how imitation by a robot affects people’s perceptions of their conversation with it. The robot operated in one of three modes: full head gesture mimicry, partial head gesture mimicry (nodding only), and non-mimicry (blinking). Participants rated how satisfied they were with the interaction. We hypothesized that participants in the full head gesture condition would rate their interaction most positively, followed by those in the partial and non-mimicry conditions. We also performed gesture analysis to see whether any differences existed between groups, and found that men made significantly more gestures than women while interacting with the robot. Finally, we interviewed participants to gain additional insight into their feelings of rapport with the robot, which revealed a number of valuable findings.
Pages: 99–108
Page count: 9
Related papers
50 entries in total
  • [21] A real-time human-robot collision safety evaluation method for collaborative robot
    Shin, Heonseop
    Kim, Sanghoon
    Seo, Kwang
    Rhim, Sungsoo
    2019 THIRD IEEE INTERNATIONAL CONFERENCE ON ROBOTIC COMPUTING (IRC 2019), 2019, : 509 - 513
  • [22] Real-time human motion analysis for human-robot interaction
    Molina-Tanco, L
    Bandera, JP
    Marfil, R
    Sandoval, F
    2005 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS, VOLS 1-4, 2005, : 1808 - 1813
  • [23] Real-Time Recognition of Human Postures for Human-Robot Interaction
    Zafar, Zuhair
    Venugopal, Rahul
    Berns, Karsten
    ACHI 2018: THE ELEVENTH INTERNATIONAL CONFERENCE ON ADVANCES IN COMPUTER-HUMAN INTERACTIONS, 2018, : 114 - 119
  • [24] Real-Time Hand Gesture Recognition for Human Robot Interaction
    Correa, Mauricio
    Ruiz-del-Solar, Javier
    Verschae, Rodrigo
    Lee-Ferny, Jong
    Castillo, Nelson
    ROBOCUP 2009: ROBOT SOCCER WORLD CUP XIII, 2010, 5949 : 46 - 57
  • [25] Human-Robot Interaction in Real Time
    Jaluvka, Michal
    Volna, Eva
    INTERNATIONAL CONFERENCE OF COMPUTATIONAL METHODS IN SCIENCES AND ENGINEERING 2018 (ICCMSE-2018), 2018, 2040
  • [26] Human-robot interaction in Industry 4.0 based on an Internet of Things real-time gesture control system
    Roda-Sanchez, Luis
    Olivares, Teresa
    Garrido-Hidalgo, Celia
    Luis de la Vara, Jose
    Fernandez-Caballero, Antonio
    INTEGRATED COMPUTER-AIDED ENGINEERING, 2021, 28 (02) : 159 - 175
  • [27] Speech to Head Gesture Mapping in Multimodal Human-Robot Interaction
    Aly, Amir
    Tapus, Adriana
    SERVICE ORIENTATION IN HOLONIC AND MULTI-AGENT MANUFACTURING CONTROL, 2012, 402 : 183 - 196
  • [28] eEVA as a Real-Time Multimodal Agent Human-Robot Interface
    Pena, P.
    Polceanu, M.
    Lisetti, C.
    Visser, U.
    ROBOT WORLD CUP XXII, ROBOCUP 2018, 2019, 11374 : 262 - 274
  • [29] A real-time genetic algorithm in human-robot musical improvisation
    Weinberg, Gil
    Godfrey, Mark
    Rae, Alex
    Rhoads, John
    COMPUTER MUSIC MODELING AND RETRIEVAL: SENSE OF SOUNDS, 2008, 4969 : 351 - 359
  • [30] A Human-Robot Interaction for a Mecanum Wheeled Mobile Robot with Real-Time 3D Two-Hand Gesture Recognition
    Luo, Xueling
    Amighetti, Andrea
    Zhang, Dan
    2019 3RD INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE, AUTOMATION AND CONTROL TECHNOLOGIES (AIACT 2019), 2019, 1267