Human-Robot Interaction Using Learning from Demonstrations and a Wearable Glove with Multiple Sensors

Cited by: 3
Authors
Singh, Rajmeet [1 ]
Mozaffari, Saeed [1 ]
Akhshik, Masoud [1 ]
Ahamed, Mohammed Jalal [1 ]
Rondeau-Gagne, Simon [2 ]
Alirezaee, Shahpour [1 ]
Affiliations
[1] Univ Windsor, Mech Automot & Mat Engn Dept, Windsor, ON N9B 3P4, Canada
[2] Univ Windsor, Dept Chem & Biochem, Windsor, ON N9B 3P4, Canada
Keywords
robotic grasping; human-robot interaction; inertia; pressure; flexi sensors; wearable devices; learning from demonstration;
DOI
10.3390/s23249780
Chinese Library Classification
O65 [Analytical Chemistry];
Discipline Classification Code
070302 ; 081704 ;
Abstract
Human-robot interaction is of the utmost importance as it enables seamless collaboration and communication between humans and robots, leading to enhanced productivity and efficiency. It involves gathering data from humans, transmitting the data to a robot for execution, and providing feedback to the human. To perform complex tasks, such as robotic grasping and manipulation, which require both human intelligence and robotic capabilities, effective interaction modes are required. To address this issue, we use a wearable glove to collect relevant data from a human demonstrator for improved human-robot interaction. Accelerometer, pressure, and flexi sensors were embedded in the wearable glove to measure motion and force information for handling objects of different sizes, materials, and conditions. A machine learning algorithm is proposed to recognize grasp orientation and position, based on the multi-sensor fusion method.
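The abstract describes feature-level fusion of accelerometer, flex, and pressure readings followed by a classifier that recognizes the demonstrated grasp. The paper does not specify its algorithm here, so the sketch below is only an illustration of the general idea: concatenate the glove's sensor channels into one feature vector and classify with a minimal nearest-centroid model. Sensor counts, scaling constants, and grasp labels are all hypothetical.

```python
import math

# Hypothetical glove layout: 3-axis accelerometer, 5 flex sensors,
# 5 fingertip pressure sensors -> 13-dimensional fused feature vector.
def fuse(accel, flex, pressure, g=9.81):
    """Feature-level fusion: scale the accelerometer to ~[-1, 1] and
    concatenate it with flex/pressure channels (assumed pre-normalized)."""
    return [a / g for a in accel] + list(flex) + list(pressure)

class NearestCentroid:
    """Toy grasp classifier: one centroid per demonstrated grasp class."""
    def fit(self, X, y):
        sums = {}
        for xi, yi in zip(X, y):
            acc, n = sums.setdefault(yi, ([0.0] * len(xi), 0))
            sums[yi] = ([a + b for a, b in zip(acc, xi)], n + 1)
        self.centroids = {c: [a / n for a in acc]
                          for c, (acc, n) in sums.items()}
        return self

    def predict(self, x):
        # Assign the grasp class whose centroid is closest (Euclidean).
        return min(self.centroids,
                   key=lambda c: math.dist(x, self.centroids[c]))

# Toy demonstrations: two grasp classes with distinct sensor signatures.
X = [fuse([0.1, 0.0, 9.8], [0.90] * 5, [0.80] * 5),   # power grasp
     fuse([0.2, 0.0, 9.7], [0.85] * 5, [0.75] * 5),
     fuse([9.8, 0.0, 0.1], [0.20] * 5, [0.10] * 5),   # precision pinch
     fuse([9.7, 0.0, 0.2], [0.25] * 5, [0.15] * 5)]
y = ["power", "power", "pinch", "pinch"]

clf = NearestCentroid().fit(X, y)
print(clf.predict(fuse([0.15, 0.0, 9.75], [0.88] * 5, [0.78] * 5)))  # -> power
```

In practice a learned model (e.g. a neural network trained on many demonstrations) would replace the centroid rule, but the fusion step, concatenating heterogeneous sensor channels into one vector before classification, is the same.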
Pages: 17
Related Papers
50 records in total
  • [1] Skill Learning for Human-Robot Interaction Using Wearable Device
    Fang, Bin
    Wei, Xiang
    Sun, Fuchun
    Huang, Haiming
    Yu, Yuanlong
    Liu, Huaping
    TSINGHUA SCIENCE AND TECHNOLOGY, 2019, 24 (06) : 654 - 662
  • [3] Unified Learning from Demonstrations, Corrections, and Preferences during Physical Human-Robot Interaction
    Mehta, Shaunak A.
    Losey, Dylan P.
    ACM TRANSACTIONS ON HUMAN-ROBOT INTERACTION, 2024, 13 (03)
  • [4] Learning Sequential Human-Robot Interaction Tasks from Demonstrations: The Role of Temporal Reasoning
    Carpio, Estuardo
    Clark-Turner, Madison
    Begum, Momotaz
    2019 28TH IEEE INTERNATIONAL CONFERENCE ON ROBOT AND HUMAN INTERACTIVE COMMUNICATION (RO-MAN), 2019,
  • [5] Learning from Demonstrations in Human-Robot Collaborative Scenarios: A Survey
    Sosa-Ceron, Arturo Daniel
    Gonzalez-Hernandez, Hugo Gustavo
    Reyes-Avendano, Jorge Antonio
    ROBOTICS, 2022, 11 (06)
  • [6] Wearable Sensors for Human-Robot Walking Together
    Moschetti, Alessandra
    Cavallo, Filippo
    Esposito, Dario
    Penders, Jacques
    Di Nuovo, Alessandro
    ROBOTICS, 2019, 8 (02)
  • [7] Towards the Quantification of Human-Robot Imitation Using Wearable Inertial Sensors
    Xochicale, Miguel P.
    Baber, Chris
    Oussalah, Mourad
    COMPANION OF THE 2017 ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION (HRI'17), 2017, : 327 - 328
  • [8] Learning cooperation from human-robot interaction
    Nicolescu, MN
    Mataric, MJ
    DISTRIBUTED AUTONOMOUS ROBOTIC SYSTEMS, 2000, : 477 - 478
  • [9] Effective Human-Robot Collaboration Through Wearable Sensors
    Al-Yacoub, Ali
    Buerkle, Achim
    Flanagan, Myles
    Ferreira, Pedro
    Hubbard, Ella-Mae
    Lohse, Niels
    2020 25TH IEEE INTERNATIONAL CONFERENCE ON EMERGING TECHNOLOGIES AND FACTORY AUTOMATION (ETFA), 2020, : 651 - 658
  • [10] The Effect of Multiple Robot Interaction on Human-Robot Interaction
    Yang, Jeong-Yean
    Kwon, Dong-Soo
    2012 9TH INTERNATIONAL CONFERENCE ON UBIQUITOUS ROBOTS AND AMBIENT INTELLIGENCE (URAI), 2012, : 30 - 33