Supporting Human-Robot Interaction by Projected Augmented Reality and a Brain Interface

Cited by: 1
Authors
De Pace, Francesco [1 ]
Manuri, Federico [2 ]
Bosco, Matteo [1 ]
Sanna, Andrea [2 ]
Kaufmann, Hannes [1 ]
Affiliations
[1] TU Wien, Virtual and Augmented Reality Group, A-1040 Vienna, Austria
[2] Politecnico di Torino, Department of Control and Computer Engineering, I-10129 Turin, Italy
Keywords
Assistive robotics; augmented reality (AR); brain interface; NextMind; severe motor impairment; steady-state visual evoked potential (SSVEP)
DOI
10.1109/THMS.2024.3414208
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
This article presents a brain-computer interface (BCI) coupled with an augmented reality (AR) system to support human-robot interaction when controlling a robotic arm for pick-and-place tasks. BCIs can process steady-state visual evoked potentials (SSVEPs), signals generated in response to visual stimuli. These stimuli can be conveyed to the user through AR systems, expanding the range of possible applications. The proposed approach leverages the NextMind BCI to let users select objects within the reach of the robotic arm. By displaying a visual anchor associated with each object in the scene via projected AR, the NextMind device can detect when users focus their gaze on one of the anchors, thus triggering the robotic arm's pick-up action. The system has been designed around the needs and limitations of mobility-impaired people, to support them in controlling a robotic arm for pick-and-place tasks. Two different approaches for positioning the visual anchors are proposed and analyzed. User tests show that both approaches are highly appreciated. The system's performance is highly robust, allowing users to select objects easily, quickly, and reliably.
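The abstract describes an event-driven pipeline: the BCI reports which projected visual anchor the user is focusing on, the system maps that anchor to an object pose, and the arm executes a pick followed by a place. The sketch below illustrates that control flow only; all names in it (Pose, ANCHOR_POSES, RobotArm, on_anchor_triggered, DROP_OFF) and all coordinate values are hypothetical placeholders, not the authors' implementation or the NextMind SDK API (which itself targets Unity/C#).

```python
# Hypothetical sketch of the anchor-selection-to-pick pipeline described in
# the abstract. BCIClient events are simulated; RobotArm and the anchor/pose
# tables are illustrative stand-ins, not the authors' code or a real SDK.
from dataclasses import dataclass


@dataclass
class Pose:
    x: float
    y: float
    z: float


# One projected AR anchor per graspable object, mapped to an object pose
# expressed in the robot's base frame (values invented for illustration).
ANCHOR_POSES = {
    "anchor_cup":  Pose(0.30, 0.10, 0.05),
    "anchor_book": Pose(0.25, -0.15, 0.02),
}
DROP_OFF = Pose(0.10, 0.35, 0.05)  # fixed place location


class RobotArm:
    """Stub for the arm controller; a real system would wrap its own SDK."""

    def pick(self, pose: Pose) -> None:
        print(f"picking at ({pose.x:.2f}, {pose.y:.2f}, {pose.z:.2f})")

    def place(self, pose: Pose) -> None:
        print(f"placing at ({pose.x:.2f}, {pose.y:.2f}, {pose.z:.2f})")


def on_anchor_triggered(anchor_id: str, arm: RobotArm) -> None:
    """Callback fired when the BCI reports sustained focus on an anchor."""
    pose = ANCHOR_POSES.get(anchor_id)
    if pose is None:
        return  # focus event on an unknown or stale anchor: ignore it
    arm.pick(pose)
    arm.place(DROP_OFF)


if __name__ == "__main__":
    arm = RobotArm()
    # Simulate two SSVEP focus events that a BCI client would deliver.
    for event in ("anchor_cup", "anchor_book"):
        on_anchor_triggered(event, arm)
```

The design choice the sketch highlights is that the BCI never commands the arm directly: it only emits a discrete selection event, and the mapping from anchor to motion is handled on the robot side, which keeps the interaction safe for users with severe motor impairment.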
Pages: 599-608
Page count: 10