Brain-Computer Interface Integrated With Augmented Reality for Human-Robot Interaction

Times Cited: 19
Authors
Fang, Bin [1 ]
Ding, Wenlong [2 ]
Sun, Fuchun [1 ]
Shan, Jianhua [2 ]
Wang, Xiaojia [3 ]
Wang, Chengyin [2 ]
Zhang, Xinyu [4 ]
Affiliations
[1] Tsinghua Univ, Dept Comp Sci & Technol, Beijing 100084, Peoples R China
[2] Anhui Univ Technol, Dept Mech Engn, Maanshan 243002, Anhui, Peoples R China
[3] Clemson Univ, Dept Elect & Comp Engn, Clemson, SC 29634 USA
[4] Tsinghua Univ, State Key Lab Automot Safety & Energy, Beijing 100084, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Augmented reality (AR); brain-computer interface (BCI) system; FB-tCNN; human-robot interaction; steady-state visual evoked potential (SSVEP); stimulation interface; visual information; COMMUNICATION;
DOI
10.1109/TCDS.2022.3194603
CLC Number
TP18 [Theory of Artificial Intelligence];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Brain-computer interfaces (BCIs) have gradually been adopted in human-robot interaction systems. The steady-state visual evoked potential (SSVEP), an electroencephalography (EEG) paradigm, has attracted increasing attention in BCI research because of its stability and efficiency. However, a traditional SSVEP-BCI system requires an independent monitor to display the stimulus targets, and each stimulus target is fixedly mapped to a preset command. These constraints limit the use of SSVEP-BCI application systems in complex and changeable scenarios. In this study, an SSVEP-BCI system integrated with augmented reality (AR) is proposed. Furthermore, a stimulation interface is constructed by merging the visual information of the objects with the stimulus targets, so that the mapping between stimulus targets and objects is updated automatically as the objects in the workspace change. In the online experiment of the AR-based SSVEP-BCI cue-guided grasping task with a robotic arm, the grasping success rate is 87.50 +/- 3.10% with an SSVEP-EEG recognition time of 0.5 s based on FB-tCNN. The proposed AR-based SSVEP-BCI system enables users to select intended targets more ecologically and to grasp a greater variety of objects with a limited number of stimulus targets, giving it the potential to be used in complex and changeable scenarios.
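The abstract names FB-tCNN as the classifier for 0.5-s SSVEP-EEG windows. The sketch below only illustrates the general filter-bank-plus-temporal-CNN idea behind that kind of model, not the authors' exact architecture; the sampling rate, channel count, sub-band edges, layer sizes, and number of stimulus targets are hypothetical placeholders, and PyTorch/SciPy are assumed.

```python
# Minimal sketch of a filter-bank temporal-CNN (FB-tCNN-style) SSVEP classifier.
# All hyperparameters below are hypothetical placeholders, not values from the paper.
import numpy as np
import torch
import torch.nn as nn
from scipy.signal import butter, filtfilt

FS = 250            # assumed EEG sampling rate (Hz)
WIN_S = 0.5         # 0.5 s recognition window, as stated in the abstract
N_CH = 9            # assumed number of occipital EEG channels
N_TARGETS = 4       # assumed number of SSVEP stimulus targets
SUB_BANDS = [(6, 18), (14, 26), (22, 34)]   # hypothetical filter-bank pass-bands (Hz)

def filter_bank(eeg):
    """Band-pass an (n_ch, n_samples) EEG window into each sub-band."""
    bands = []
    for lo, hi in SUB_BANDS:
        b, a = butter(4, [lo / (FS / 2), hi / (FS / 2)], btype="band")
        bands.append(filtfilt(b, a, eeg, axis=-1))
    return np.stack(bands)              # (n_bands, n_ch, n_samples)

class FBtCNNSketch(nn.Module):
    """Small temporal CNN applied per sub-band; features fused for classification."""
    def __init__(self, n_bands=len(SUB_BANDS), n_ch=N_CH, n_classes=N_TARGETS):
        super().__init__()
        self.branch = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=(n_ch, 1)),                  # spatial filtering
            nn.Conv2d(16, 16, kernel_size=(1, 25), padding=(0, 12)),  # temporal filtering
            nn.BatchNorm2d(16),
            nn.ELU(),
            nn.AdaptiveAvgPool2d((1, 8)),
        )
        self.head = nn.Linear(n_bands * 16 * 8, n_classes)

    def forward(self, x):               # x: (batch, n_bands, n_ch, n_samples)
        feats = [self.branch(x[:, b:b + 1]) for b in range(x.shape[1])]
        fused = torch.cat([f.flatten(1) for f in feats], dim=1)
        return self.head(fused)

if __name__ == "__main__":
    window = np.random.randn(N_CH, int(FS * WIN_S))        # fake 0.5 s EEG window
    x = torch.tensor(filter_bank(window), dtype=torch.float32).unsqueeze(0)
    model = FBtCNNSketch()
    model.eval()
    with torch.no_grad():
        predicted_target = model(x).argmax(dim=1).item()
    print(f"Predicted stimulus target index: {predicted_target}")
```

In the AR-based system described above, the predicted target index would then be resolved through the stimulation interface's automatically updated mapping from stimulus targets to the objects currently present in the workspace.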
Pages: 1702-1711
Number of Pages: 10
Related Papers
50 records in total
  • [21] A Demonstration of the Taxonomy of Functional Augmented Reality for Human-Robot Interaction
    Phaijit, Ornnalin
    Obaid, Mohammad
    Sammut, Claude
    Johal, Wafa
    PROCEEDINGS OF THE 2022 17TH ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION (HRI '22), 2022, : 981 - 985
  • [22] Explainable Human-Robot Interaction for Imitation Learning in Augmented Reality
    Belardinelli, Anna
    Wang, Chao
    Gienger, Michael
    HUMAN-FRIENDLY ROBOTICS 2023, HFR 2023, 2024, 29 : 94 - 109
  • [23] Experimental Paradigm of Abnormalities Detection in Human-Robot Collaboration with Brain-Computer Interaction Techniques
    Yu, Xinjia
    Zhou, Yang
    Duan, Jian
    Shi, Tielin
    Cheng, Tao
    COMPUTATIONAL AND EXPERIMENTAL SIMULATIONS IN ENGINEERING, ICCES 2024-VOL 2, 2025, 173 : 719 - 726
  • [24] An Augmented Reality Based Human-Robot Interaction Interface Using Kalman Filter Sensor Fusion
    Li, Chunxu
    Fahmy, Ashraf
    Sienz, Johann
    SENSORS, 2019, 19 (20)
  • [25] Brain-Computer Interface and Hand-Guiding Control in a Human-Robot Collaborative Assembly Task
    Dmytriyev, Yevheniy
    Insero, Federico
    Carnevale, Marco
    Giberti, Hermes
    MACHINES, 2022, 10 (08)
  • [26] Transparent Robot Behavior Using Augmented Reality in Close Human-Robot Interaction
    Bolano, Gabriele
    Juelg, Christian
    Roennau, Arne
    Dillmann, Ruediger
    2019 28TH IEEE INTERNATIONAL CONFERENCE ON ROBOT AND HUMAN INTERACTIVE COMMUNICATION (RO-MAN), 2019,
  • [27] Human-Robot Cooperation via Brain Computer Interface
    Foresi, Gabriele
    Freddi, Alessandro
    Iarlori, Sabrina
    Monteriu, Andrea
    Ortenzi, Davide
    Pagnotta, Daniele Proietti
    2017 IEEE 7TH INTERNATIONAL CONFERENCE ON CONSUMER ELECTRONICS - BERLIN (ICCE-BERLIN), 2017, : 1 - 2
  • [28] Brain-Computer Interface in Virtual Reality
    Abbasi-Asl, Reza
    Keshavarzi, Mohammad
    Chan, Dorian Yao
    2019 9TH INTERNATIONAL IEEE/EMBS CONFERENCE ON NEURAL ENGINEERING (NER), 2019, : 1220 - 1224
  • [29] Mixed Reality as a Bidirectional Communication Interface for Human-Robot Interaction
    Rosen, Eric
    Whitney, David
    Fishman, Michael
    Ullman, Daniel
    Tellex, Stefanie
    2020 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2020, : 11431 - 11438
  • [30] A Brain-Computer Interface for Robot Navigation
    Nawroj, Ahsan I.
    Wang, Siyuan
    Yu, Yih-Choung
    Gabel, Lisa
    2012 38TH ANNUAL NORTHEAST BIOENGINEERING CONFERENCE (NEBEC), 2012, : 15 - +