When my robot smiles at me: Enabling human-robot rapport via real-time head gesture mimicry

Citations: 0
|
Authors
Laurel D. Riek
Philip C. Paul
Peter Robinson
Affiliations
[1] University of Cambridge,Computer Laboratory
[2] University of Cambridge,Department of Engineering
Source
Journal on Multimodal User Interfaces | 2010 / Vol. 3
Keywords
Affective computing; Empathy; Facial expressions; Human-robot interaction; Social robotics;
DOI
Not available
Abstract
People use imitation to encourage each other during conversation. We conducted an experiment to investigate how imitation by a robot affects people’s perceptions of their conversation with it. The robot operated in one of three ways: full head gesture mimicking, partial head gesture mimicking (nodding), and non-mimicking (blinking). Participants rated how satisfied they were with the interaction. We hypothesized that participants in the full head gesture condition would rate their interaction the most positively, followed by those in the partial and non-mimicking conditions. We also performed gesture analysis to see whether any differences existed between groups, and found that men made significantly more gestures than women while interacting with the robot. Finally, we interviewed participants about their feelings of rapport with the robot, which yielded a number of valuable insights.
Pages: 99–108
Page count: 9
Related papers
50 records total
  • [1] When my robot smiles at me: Enabling human-robot rapport via real-time head gesture mimicry
    Riek, Laurel D.
    Paul, Philip C.
    Robinson, Peter
    JOURNAL ON MULTIMODAL USER INTERFACES, 2010, 3 (1-2) : 99 - 108
  • [2] Real-Time Face and Gesture Analysis for Human-Robot Interaction
    Wallhoff, Frank
    Rehrl, Tobias
    Mayer, Christoph
    Radig, Bernd
    REAL-TIME IMAGE AND VIDEO PROCESSING 2010, 2010, 7724
  • [3] Gesture Mimicry in Social Human-Robot Interaction
    Stolzenwald, Janis
    Bremner, Paul
    2017 26TH IEEE INTERNATIONAL SYMPOSIUM ON ROBOT AND HUMAN INTERACTIVE COMMUNICATION (RO-MAN), 2017, : 430 - 436
  • [4] Real-time vision based gesture recognition for human-robot interaction
    Hong, Seok-ju
    Setiawan, Nurul Arif
    Lee, Chil-woo
    KNOWLEDGE-BASED INTELLIGENT INFORMATION AND ENGINEERING SYSTEMS: KES 2007 - WIRN 2007, PT I, PROCEEDINGS, 2007, 4692 : 493 - +
  • [5] Real-time person tracking and pointing gesture recognition for human-robot interaction
    Nickel, K
    Stiefelhagen, R
    COMPUTER VISION IN HUMAN-COMPUTER INTERACTION, PROCEEDINGS, 2004, 3058 : 28 - 38
  • [6] A Novel Real-Time Gesture Recognition Algorithm for Human-Robot Interaction on the UAV
    Chen, Bo
    Hua, Chunsheng
    Han, Jianda
    He, Yuqing
    COMPUTER VISION SYSTEMS, ICVS 2017, 2017, 10528 : 518 - 526
  • [7] Real-time safety for human-robot interaction
    Kulic, D
    Croft, EA
    ROBOTICS AND AUTONOMOUS SYSTEMS, 2006, 54 (01) : 1 - 12
  • [8] Real-time safety for human-robot interaction
    Kulic, D
    Croft, EA
    2005 12TH INTERNATIONAL CONFERENCE ON ADVANCED ROBOTICS, 2005, : 719 - 724
  • [9] An Integrated Real-Time Hand Gesture Recognition Framework for Human-Robot Interaction in Agriculture
    Moysiadis, Vasileios
    Katikaridis, Dimitrios
    Benos, Lefteris
    Busato, Patrizia
    Anagnostis, Athanasios
    Kateris, Dimitrios
    Pearson, Simon
    Bochtis, Dionysis
    APPLIED SCIENCES-BASEL, 2022, 12 (16):
  • [10] Adaptive Real-Time Gesture Recognition in a Dynamic Scenario for Human-Robot Collaborative Applications
    Scoccia, Cecilia
    Menchi, Giacomo
    Ciccarelli, Marianna
    Forlini, Matteo
    Papetti, Alessandra
    ADVANCES IN ITALIAN MECHANISM SCIENCE, IFTOMM ITALY 2022, 2022, 122 : 637 - 644