Space, Speech, and Gesture in Human-Robot Interaction

Cited by: 0
Author
Mead, Ross [1 ]
Affiliation
[1] Univ So Calif, Interact Lab, Los Angeles, CA 90089 USA
Keywords
Human-robot interaction; proxemics; speech; gesture; multimodal
DOI
Not available
CLC number
TP18 [Theory of artificial intelligence]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
To enable natural and productive situated human-robot interaction, a robot must both understand and control proxemics, the social use of space, in order to employ communication mechanisms analogous to those used by humans: social speech and gesture production and recognition. My research focuses on answering these questions: How do social (auditory and visual) and environmental (noisy and occluding) stimuli influence spatially situated communication between humans and robots, and how should a robot dynamically adjust its communication mechanisms to maximize human perceptions of its social signals in the presence of extrinsic and intrinsic sensory interference?
Pages: 333-336
Number of pages: 4
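
The abstract describes a robot dynamically adjusting its speech and gesture production to interpersonal distance and to auditory/visual interference. As a minimal illustrative sketch only, and not Mead's actual model, the following Python snippet shows one hypothetical adaptation rule; the function name, the dB constants, and the gesture-scaling factors are all assumptions introduced here for illustration.

    import math
    from dataclasses import dataclass


    @dataclass
    class CommunicationParams:
        speech_db: float      # output speech level at the robot's speaker (dB SPL)
        gesture_scale: float  # gesture amplitude as a fraction of maximum reach


    def adapt_communication(distance_m: float, ambient_noise_db: float) -> CommunicationParams:
        """Choose a speech level and gesture size from interlocutor distance and the noise floor."""
        # Free-field sound attenuates by roughly 6 dB per doubling of distance, so the
        # robot must speak louder when the person is farther away or the room is noisier.
        base_db = 60.0                                          # comfortable level at 1 m (assumed)
        distance_loss = 20.0 * math.log10(max(distance_m, 0.1))
        snr_margin = 10.0                                       # desired margin above noise (assumed)
        speech_db = max(base_db, ambient_noise_db + snr_margin) + distance_loss

        # Larger gestures at greater distances for visual salience, clamped to the arm's range.
        gesture_scale = min(1.0, 0.3 + 0.2 * distance_m)

        return CommunicationParams(round(speech_db, 1), round(gesture_scale, 2))


    # Example: a person about 2.5 m away in a 55 dB SPL room.
    print(adapt_communication(2.5, 55.0))

In this toy rule the speech level grows with both distance and ambient noise while the gesture amplitude grows with distance alone; a real system of the kind the abstract envisions would instead fit such mappings to measured human perception of the robot's social signals.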