A Survey of Multimodal Perception Methods for Human-Robot Interaction in Social Environments

Cited by: 1
Authors
Duncan, John A. [1 ]
Alambeigi, Farshid [1 ]
Pryor, Mitchell W. [1 ]
Affiliations
[1] The University of Texas at Austin, Austin, TX 78712, USA
Keywords
Human-robot interaction; multimodal perception; situated interaction; social robotics; human social environments; USER ENGAGEMENT; SOUND SOURCES; LOCALIZATION; RECOGNITION; DESIGN; FUSION; SYSTEM; FRAMEWORK; NETWORK; DATASET
DOI
10.1145/3657030
Chinese Library Classification (CLC)
TP24 [Robotics]
Discipline classification codes
080202; 1405
Abstract
Human-robot interaction (HRI) in human social environments (HSEs) poses unique challenges for robot perception systems, which must combine asynchronous, heterogeneous data streams in real time. Multimodal perception systems are well suited for HRI in HSEs and can provide richer, more robust interaction for robots operating among humans. In this article, we provide an overview of multimodal perception systems being used in HSEs, intended as an introduction to the topic and a summary of relevant trends, techniques, resources, challenges, and terminology. We surveyed 15 peer-reviewed robotics and HRI publications over the past 10+ years, providing details about the data acquisition, processing, and fusion techniques used in 65 multimodal perception systems across various HRI domains. Our survey provides information about the hardware, software, datasets, and methods currently available for HRI perception research, as well as how these perception systems are being applied in HSEs. Based on the survey, we summarize trends, challenges, and limitations of multimodal human perception systems for robots, identify resources for researchers and developers, and propose future research areas to advance the field.
Pages: 50
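As a purely illustrative sketch (not drawn from the article or any of the surveyed systems), the following Python snippet shows one simple way to combine two asynchronous perception streams, a visual person detector and a sound-source localizer, by buffering timestamped observations and fusing the nearest-in-time pair with a confidence-weighted average. All names (Observation, ModalityBuffer, fuse_bearing) and the numeric tolerances are assumptions made for this example only.

```python
# Minimal late-fusion sketch: two asynchronous modality streams, a visual
# person detector and a sound-source localizer, are buffered with timestamps,
# paired by nearest-in-time matching, and fused with a confidence-weighted
# average of their bearing estimates. Names and tolerances are illustrative.
from dataclasses import dataclass
from bisect import insort
from typing import List, Optional

@dataclass(order=True)
class Observation:
    t: float           # timestamp in seconds
    bearing: float     # estimated direction to the person, in degrees
    confidence: float  # detector confidence in [0, 1]

class ModalityBuffer:
    """Keeps a short, time-ordered history of one modality's observations."""
    def __init__(self, horizon: float = 2.0):
        self.horizon = horizon
        self.obs: List[Observation] = []

    def push(self, o: Observation) -> None:
        insort(self.obs, o)
        # Drop observations older than the horizon relative to the newest one.
        cutoff = self.obs[-1].t - self.horizon
        self.obs = [x for x in self.obs if x.t >= cutoff]

    def nearest(self, t: float, tol: float = 0.2) -> Optional[Observation]:
        """Return the observation closest in time to t, if within tolerance."""
        if not self.obs:
            return None
        best = min(self.obs, key=lambda x: abs(x.t - t))
        return best if abs(best.t - t) <= tol else None

def fuse_bearing(vision: Observation, audio: Observation) -> float:
    """Confidence-weighted average of the two bearing estimates.
    (Angle wrap-around at +/-180 degrees is ignored in this sketch.)"""
    w = vision.confidence + audio.confidence
    return (vision.confidence * vision.bearing + audio.confidence * audio.bearing) / w

# Usage: vision frames arrive at ~30 Hz, audio localizations at ~10 Hz.
vision_buf, audio_buf = ModalityBuffer(), ModalityBuffer()
vision_buf.push(Observation(t=10.03, bearing=14.0, confidence=0.9))
audio_buf.push(Observation(t=10.10, bearing=20.0, confidence=0.6))

v = vision_buf.obs[-1]
a = audio_buf.nearest(v.t)
if a is not None:
    print(f"fused bearing: {fuse_bearing(v, a):.1f} deg")  # -> 16.4 deg
```

Timestamp-based late fusion of decisions is only one of several strategies in this design space (feature-level and model-level fusion are common alternatives); the sketch is meant solely to make the abstract's phrase "asynchronous, heterogeneous data streams" concrete.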