A closed-loop brain-computer interface with augmented reality feedback for industrial human-robot collaboration

Cited by: 19
Authors
Ji, Zhenrui [1 ,2 ]
Liu, Quan [1 ,2 ]
Xu, Wenjun [1 ,2 ]
Yao, Bitao [2 ,3 ]
Liu, Jiayi [1 ,2 ]
Zhou, Zude [1 ,2 ]
Affiliations
[1] Wuhan Univ Technol, Sch Informat Engn, Wuhan 430070, Peoples R China
[2] Wuhan Univ Technol, Hubei Key Lab Broadband Wireless Commun & Sensor, Wuhan 430070, Peoples R China
[3] Wuhan Univ Technol, Sch Mech & Elect Engn, Wuhan 430070, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Industrial human-robot collaboration; Brain-computer interface; Augmented reality; Interactive robotic path planning; TECHNOLOGY; ARTIFACTS; REMOVAL; EEG;
DOI
10.1007/s00170-021-07937-z
CLC Number
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Industrial human-robot collaboration (HRC) aims to combine human intelligence and robotic capability to achieve higher productivity. In industrial HRC, communication between humans and robots is essential to improve mutual understanding of intent and make collaboration more fluent. A brain-computer interface (BCI) records the user's brain activity and translates it into interaction messages (e.g., control commands) to the outside world, which can build a direct and efficient communication channel between human and robot. However, because it lacks an information feedback mechanism, it is challenging for a BCI to control a robot with many degrees of freedom using only a limited number of classifiable mental states. To address this problem, this paper proposes a closed-loop BCI with contextual visual feedback provided by an augmented reality (AR) headset. In this BCI, electroencephalogram (EEG) patterns produced by multiple voluntary eye blinks are used as the input, and an online detection algorithm for them is proposed whose average accuracy reaches 94.31%. Moreover, an AR-enabled information feedback interface is designed to achieve interactive robotic path planning. A case study of an industrial HRC assembly task is also developed to show that the proposed closed-loop BCI can shorten the user input time in human-robot interaction.
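The abstract describes voluntary multi-blink EEG patterns as the input channel but does not state the detection algorithm itself. The following is a minimal sketch of the kind of online pipeline such an input implies: band-pass filtering a frontal EEG channel, thresholding blink peaks, and mapping the blink count in a short decision window to an interaction command. The sampling rate, thresholds, window lengths, command mapping, and synthetic test signal are all illustrative assumptions, not the authors' method.

# Minimal sketch (not the paper's algorithm): count voluntary eye blinks in a
# frontal EEG channel and map the count to an interaction command.
import numpy as np
from scipy.signal import butter, sosfiltfilt, find_peaks

FS = 250                      # sampling rate in Hz (assumed)
BLINK_BAND = (0.5, 10.0)      # blink energy sits at low frequencies (assumed band)
AMP_THRESHOLD_UV = 80.0       # blink amplitude threshold in microvolts (assumed)
DECISION_WINDOW_S = 2.0       # window in which consecutive blinks are counted
MIN_BLINK_GAP_S = 0.25        # refractory gap between two blink peaks

def bandpass(x, fs=FS, band=BLINK_BAND, order=4):
    """Zero-phase band-pass filter isolating the blink-related component."""
    sos = butter(order, [band[0] / (fs / 2), band[1] / (fs / 2)],
                 btype="band", output="sos")
    return sosfiltfilt(sos, x)

def count_blinks(window_uv, fs=FS):
    """Count blink peaks in one decision window of a frontal EEG channel."""
    filtered = bandpass(window_uv, fs)
    peaks, _ = find_peaks(filtered,
                          height=AMP_THRESHOLD_UV,
                          distance=int(MIN_BLINK_GAP_S * fs))
    return len(peaks)

def blinks_to_command(n_blinks):
    """Illustrative (hypothetical) mapping from blink count to a command."""
    return {2: "CONFIRM_WAYPOINT", 3: "NEXT_OPTION", 4: "ABORT"}.get(n_blinks)

if __name__ == "__main__":
    # Synthetic demo: baseline EEG noise plus three blink-like deflections.
    t = np.arange(0, DECISION_WINDOW_S, 1 / FS)
    eeg = 10 * np.random.randn(t.size)
    for onset in (0.4, 0.9, 1.4):
        eeg += 150 * np.exp(-((t - onset) ** 2) / (2 * 0.03 ** 2))
    n = count_blinks(eeg)
    print(f"blinks detected: {n}, command: {blinks_to_command(n)}")

In a closed-loop setup of the kind the abstract describes, the detected command would be sent to the AR interface, which then renders the candidate robot path for the user to confirm or reject with a further blink pattern.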
Pages: 3083-3098
Number of pages: 16