Sharing skills: Using augmented reality for human-robot collaboration

Cited: 2
Authors
Giesler, B [1 ]
Steinhaus, P [1 ]
Walther, M [1 ]
Dillmann, R [1 ]
Affiliations
[1] Univ Karlsruhe, Chair Ind Appl Informat & Microsyst, D-76128 Karlsruhe, Germany
DOI
10.1117/12.526689
CLC classification: TP39 [Applications of computers]
Discipline codes: 081203; 0835
Abstract
Both stationary 'industrial' and autonomous mobile robots nowadays pervade many workplaces, but human-friendly interaction with them is still very much an experimental subject. One reason is that computer and robotic systems are very bad at performing certain tasks well and robustly. A prime example is the classification of sensor readings: which part of a 3D depth image is the cup, which the saucer, which the table? These are tasks that humans excel at. To alleviate this problem, we propose a team approach, wherein the robot records sensor data and uses an Augmented-Reality (AR) system to present the data to the user directly in the 3D environment. The user can then perform classification decisions directly on the data by pointing, gestures, and speech commands. After the classification has been performed by the user, the robot takes the classified data and matches it to its environment model. As a demonstration of this approach, we present an initial system for creating objects on-the-fly in the environment model. A rotating laser scanner is used to capture a 3D snapshot of the environment. This snapshot is presented to the user as an overlay over his view of the scene. The user classifies unknown objects by pointing at them. The system segments the snapshot according to the user's indications and presents the segmentation results back to the user, who can then inspect, correct, and enhance them interactively. After a satisfactory result has been reached, the laser scanner can take more snapshots from other angles and use the previous segmentation hints to construct a 3D model of the object.
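The record does not include the paper's implementation, but the core step it describes, segmenting a laser-scan snapshot from a user-indicated point, is typically done with region growing from a seed. The sketch below is a minimal illustration under assumed parameters (`radius` as the neighbor threshold), not the authors' actual algorithm:

```python
import numpy as np

def region_grow(points, seed_index, radius=0.05):
    """Grow a segment outward from a user-indicated seed point.

    points: (N, 3) array of 3D points from the scan snapshot.
    seed_index: index of the point the user pointed at.
    radius: neighbor distance threshold (hypothetical; tune per scan).
    Returns a boolean mask marking the segmented object.
    """
    in_region = np.zeros(len(points), dtype=bool)
    in_region[seed_index] = True
    frontier = [seed_index]
    while frontier:
        idx = frontier.pop()
        # All not-yet-claimed points within `radius` join the region.
        dists = np.linalg.norm(points - points[idx], axis=1)
        neighbors = np.flatnonzero((dists < radius) & ~in_region)
        in_region[neighbors] = True
        frontier.extend(neighbors.tolist())
    return in_region

# Two well-separated clusters: a seed in one must not leak into the other.
rng = np.random.default_rng(0)
cluster_a = rng.random((50, 3)) * 0.1          # tight cluster near the origin
cluster_b = rng.random((50, 3)) * 0.1 + 5.0    # distant cluster
cloud = np.vstack([cluster_a, cluster_b])
mask = region_grow(cloud, seed_index=0, radius=0.2)
```

In the interactive loop the abstract describes, the returned mask would be rendered back into the AR overlay for the user to inspect and correct before further snapshots refine the object model.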
Pages: 446-453 (8 pages)