Decoding Engagement: The Role of Closeness Cues in Human-Robot Interactions

Cited: 0
Authors
Loos, Kira [1 ]
Brandt, Mara [1 ]
Vollmer, Anna-Lisa [1 ]
Affiliations
[1] Bielefeld Univ, Med Sch OWL, Bielefeld, Germany
Keywords
Behavioral Signal Processing; Communication Initiation; Human-Robot Interaction; Intent Recognition; Non-verbal Cues Analysis; Social Robots; Social Sustainability; User Intent Prediction; Willingness to Interact; Nonverbal Communication; Gaze
DOI
10.1109/RO-MAN60168.2024.10731411
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
This work examines the critical role of nonverbal cues in initiating successful human-robot interactions (HRI), focusing on the development of an algorithm capable of decoding subtle cues that indicate a human's intention to interact with a robot. The foundation for the analysis is the concept of closeness, which relates to an individual's feeling of intimacy and comfort and indicates engagement potential in diverse settings and interactions. This paper introduces a novel algorithm designed to quantify a person's willingness to interact based on an analysis of the closeness displayed across a series of images. The algorithm's efficacy is evaluated through an empirical study involving 20 participants, yielding a dataset of 130 video sequences. These sequences were analyzed to assess the algorithm's ability to predict communication initiation intentions, with findings suggesting a significant distinction between participants engaged in interaction with a robot and those instructed to ignore it. The results underscore the algorithm's potential to facilitate more nuanced and socially sustainable HRI, laying the groundwork for future advancements in the field. This research contributes to the growing body of work on social robotics, emphasizing the importance of integrating nonverbal cue analysis to enhance robots' interactive capabilities and foster more meaningful human-robot connections.
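The abstract does not reproduce the algorithm itself, but its core idea of scoring displayed closeness over an image sequence and thresholding the result to predict willingness to interact can be sketched as follows. This is a minimal illustrative sketch only: the cue set, the weights, the aggregation rule, and the threshold are assumptions for exposition, not the authors' implementation.

```python
# Illustrative sketch: score closeness per frame from nonverbal cues, average
# over a video sequence, and threshold to predict intent to interact.
# All cue names, weights, and the threshold are assumed, not from the paper.
from dataclasses import dataclass
from typing import Sequence


@dataclass
class FrameCues:
    """Hypothetical per-frame nonverbal cues related to closeness (all in [0, 1])."""
    gaze_at_robot: float      # 1.0 = looking directly at the robot
    body_orientation: float   # 1.0 = fully facing the robot
    proximity: float          # 1.0 = within close interaction distance


def closeness_score(cues: FrameCues) -> float:
    """Combine cues into a single closeness value in [0, 1] (assumed weights)."""
    return (0.5 * cues.gaze_at_robot
            + 0.3 * cues.body_orientation
            + 0.2 * cues.proximity)


def willingness_to_interact(frames: Sequence[FrameCues], threshold: float = 0.5) -> bool:
    """Average closeness over the sequence and compare to a threshold (assumed rule)."""
    if not frames:
        return False
    mean_closeness = sum(closeness_score(f) for f in frames) / len(frames)
    return mean_closeness >= threshold


if __name__ == "__main__":
    # Toy sequence: a person gradually turning toward and approaching the robot.
    sequence = [
        FrameCues(gaze_at_robot=0.2, body_orientation=0.3, proximity=0.1),
        FrameCues(gaze_at_robot=0.6, body_orientation=0.5, proximity=0.4),
        FrameCues(gaze_at_robot=0.9, body_orientation=0.8, proximity=0.7),
    ]
    print("predicted intent to interact:", willingness_to_interact(sequence))
```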
Pages: 711 - 716
Number of pages: 6
Related papers
50 records in total
  • [31] Authoring and Verifying Human-Robot Interactions
    Porfirio, David
    Sauppe, Allison
    Albarghouthi, Aws
    Mutlu, Bilge
    UIST 2018: PROCEEDINGS OF THE 31ST ANNUAL ACM SYMPOSIUM ON USER INTERFACE SOFTWARE AND TECHNOLOGY, 2018, : 75 - 86
  • [32] Assessment of adaptive human-robot interactions
    Sekmen, Ali
    Challa, Prathima
    KNOWLEDGE-BASED SYSTEMS, 2013, 42 : 49 - 59
  • [33] Human-Robot Interactions in Investment Decisions
    Bianchi, Milo
    Briere, Marie
    MANAGEMENT SCIENCE, 2024,
  • [34] Natural Human-Robot Interaction Using Social Cues
    Romat, Hugo
    Williams, Mary-Anne
    Wang, Xun
    Johnston, Benjamin
    Bard, Henry
    ELEVENTH ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN ROBOT INTERACTION (HRI'16), 2016, : 503 - 504
  • [35] Adaptive Spacing in Human-Robot Interactions
    Papadakis, Panagiotis
    Rives, Patrick
    Spalanzani, Anne
    2014 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS 2014), 2014, : 2627 - +
  • [36] A survey of Tactile Human-Robot Interactions
    Argall, Brenna D.
    Billard, Aude G.
    ROBOTICS AND AUTONOMOUS SYSTEMS, 2010, 58 (10) : 1159 - 1176
  • [37] Animated simulation of human-robot interactions
    Luh, JYS
    Srioon, S
    PROCEEDINGS OF THE 3RD WORLD CONGRESS ON INTELLIGENT CONTROL AND AUTOMATION, VOLS 1-5, 2000, : 1361 - 1366
  • [38] Safety Issues in Human-Robot Interactions
    Vasic, Milos
    Billard, Aude
    2013 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2013, : 197 - 204
  • [39] Interactions and motions in human-robot coordination
    Luh, JYS
    Hu, SY
    ICRA '99: IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, VOLS 1-4, PROCEEDINGS, 1999, : 3171 - 3176