Supervised learning of gesture-action associations for human-robot collaboration

Cited by: 2
Authors
Shukla, Dadhichi [1 ]
Erkent, Oezguer [1 ]
Piater, Justus [1 ]
Affiliations
[1] Univ Innsbruck, Intelligent & Interact Syst, Innsbruck, Austria
Keywords
DOI
10.1109/FG.2017.97
CLC number
TP18 [Artificial intelligence theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
As human-robot collaboration methodologies develop, robots need to adopt fast learning methods in domestic scenarios. This paper presents a novel approach to learning associations between human hand gestures and the robot's manipulation actions. The role of the robot is to operate as an assistant to the user. In this context we propose a supervised learning framework to explore the gesture-action space in a human-robot collaboration scenario. The framework enables the robot to learn gesture-action associations on the fly while performing the task with the user, an example of zero-shot learning. We discuss the effect of accurate gesture detection on task performance. The accuracy of the gesture detection system directly determines the amount of effort required of the user and the number of actions performed by the robot.
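The abstract describes a robot that learns, during the interaction itself, which manipulation action each detected gesture calls for. As a rough illustration only, below is a minimal sketch of such an on-the-fly gesture-action learner, assuming a tabular mapping updated from user confirmations; the gesture labels, action names, and update rule are illustrative assumptions and are not taken from the paper.

```python
# Minimal sketch (not the authors' implementation): online learning of
# gesture-action associations from user confirmations during the task.
import random
from collections import defaultdict

class GestureActionLearner:
    def __init__(self, actions):
        self.actions = list(actions)
        # counts[gesture][action] = how often the user confirmed that
        # `action` was the correct response to `gesture`
        self.counts = defaultdict(lambda: defaultdict(int))

    def choose_action(self, gesture):
        """Pick the best-known action for a gesture; explore if unseen."""
        known = self.counts[gesture]
        if not known:
            return random.choice(self.actions)   # unknown gesture: explore
        return max(known, key=known.get)         # exploit learned association

    def give_feedback(self, gesture, action, correct):
        """Strengthen the association when the user confirms the action."""
        if correct:
            self.counts[gesture][action] += 1

# Example interaction loop: the robot proposes actions, the user confirms.
learner = GestureActionLearner(["pick_up", "hand_over", "release", "wait"])
for gesture, intended in [("point", "pick_up"), ("open_palm", "hand_over")]:
    action = learner.choose_action(gesture)
    learner.give_feedback(gesture, action, correct=(action == intended))
```

In this toy loop the number of wrong proposals before an association is learned stands in for the user effort discussed in the abstract; the paper's framework is more elaborate, but the underlying idea of refining the gesture-to-action mapping from in-task feedback is the same.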
Pages: 778-783
Number of pages: 6
Related papers
50 in total
  • [21] Learning Semantics of Gestural Instructions for Human-Robot Collaboration
    Shukla, Dadhichi
    Erkent, Ozgur
    Piater, Justus
    FRONTIERS IN NEUROROBOTICS, 2018, 12
  • [22] Transparent Interaction Based Learning for Human-Robot Collaboration
    Bagheri, Elahe
    de Winter, Joris
    Vanderborght, Bram
    FRONTIERS IN ROBOTICS AND AI, 2022, 9
  • [23] Gesture Learning Based on A Topological Approach for Human-Robot Interaction
    Obo, Takenori
    Takizawa, Kazuma
    2022 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2022,
  • [24] Machine Learning in Human-Robot Collaboration: Bridging the Gap
    Matuszek, Cynthia
    Soh, Harold
    Gombolay, Matthew
    Gopalan, Nakul
    Simmons, Reid
    Nikolaidis, Stefanos
    PROCEEDINGS OF THE 2022 17TH ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION (HRI '22), 2022, : 1275 - 1277
  • [25] Context-aware hand gesture interaction for human-robot collaboration in construction
    Wang, Xin
    Veeramani, Dharmaraj
    Dai, Fei
    Zhu, Zhenhua
    COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, 2024, 39 (22) : 3489 - 3504
  • [26] Vision-Based Hand Gesture Recognition for Human-Robot Collaboration: A Survey
    Xia, Zanwu
    Lei, Qujiang
    Yang, Yang
    Zhang, Hongda
    He, Yue
    Wang, Weijun
    Huang, Minghui
    CONFERENCE PROCEEDINGS OF 2019 5TH INTERNATIONAL CONFERENCE ON CONTROL, AUTOMATION AND ROBOTICS (ICCAR), 2019, : 198 - 205
  • [27] Digital twins for hand gesture-guided human-robot collaboration systems
    Liu, Ao
    Zhang, Yifan
    Yao, Yuan
    PROCEEDINGS OF THE INSTITUTION OF MECHANICAL ENGINEERS PART B-JOURNAL OF ENGINEERING MANUFACTURE, 2024, 238 (14) : 2060 - 2074
  • [28] DYNAMIC GESTURE DESIGN AND RECOGNITION FOR HUMAN-ROBOT COLLABORATION WITH CONVOLUTIONAL NEURAL NETWORKS
    Chen, Haodong
    Tao, Wenjin
    Leu, Ming C.
    Yin, Zhaozheng
    PROCEEDINGS OF THE 2020 INTERNATIONAL SYMPOSIUM ON FLEXIBLE AUTOMATION (ISFA2020), 2020,
  • [29] A Survey of Robot Learning Strategies for Human-Robot Collaboration in Industrial Settings
    Mukherjee, Debasmita
    Gupta, Kashish
    Chang, Li Hsin
    Najjaran, Homayoun
    ROBOTICS AND COMPUTER-INTEGRATED MANUFACTURING, 2022, 73
  • [30] Human-robot collaboration: A survey
    Bauer, Andrea
    Wollherr, Dirk
    Buss, Martin
    INTERNATIONAL JOURNAL OF HUMANOID ROBOTICS, 2008, 5 (01) : 47 - 66