Human-Robot Interaction Using Learning from Demonstrations and a Wearable Glove with Multiple Sensors

Cited by: 3
Authors
Singh, Rajmeet [1 ]
Mozaffari, Saeed [1 ]
Akhshik, Masoud [1 ]
Ahamed, Mohammed Jalal [1 ]
Rondeau-Gagne, Simon [2 ]
Alirezaee, Shahpour [1 ]
Affiliations
[1] Univ Windsor, Mech Automot & Mat Engn Dept, Windsor, ON N9B 3P4, Canada
[2] Univ Windsor, Dept Chem & Biochem, Windsor, ON N9B 3P4, Canada
Keywords
robotic grasping; human-robot interaction; inertia; pressure; flexi sensors; wearable devices; learning from demonstration
DOI
10.3390/s23249780
Chinese Library Classification
O65 [Analytical Chemistry]
Subject Classification Codes
070302; 081704
Abstract
Human-robot interaction is of the utmost importance, as it enables seamless collaboration and communication between humans and robots, leading to enhanced productivity and efficiency. It involves gathering data from humans, transmitting the data to a robot for execution, and providing feedback to the human. Complex tasks such as robotic grasping and manipulation require both human intelligence and robotic capabilities, and therefore effective interaction modes. To address this need, we use a wearable glove to collect relevant data from a human demonstrator for improved human-robot interaction. Accelerometer, pressure, and flexi sensors embedded in the wearable glove measure motion and force information for handling objects of different sizes, materials, and conditions. A machine learning algorithm based on multi-sensor fusion is proposed to recognize grasp orientation and position.
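The abstract describes the pipeline only at a high level: glove sensor streams are fused and fed to a learned classifier for grasp orientation and position. The sketch below is an illustration of that kind of pipeline, not the authors' implementation; the channel counts, window length, mean/std fusion features, and the RandomForestClassifier are all assumptions made for the example.

```python
# Minimal sketch (assumed, not the paper's method): fuse accelerometer,
# pressure, and flex-sensor windows from a wearable glove into fixed-length
# feature vectors, then train a classifier to recognize grasp labels.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

ACC_CH, PRESSURE_CH, FLEX_CH = 3, 5, 5   # assumed channel counts
WINDOW = 50                              # assumed samples per grasp window

def extract_features(window: np.ndarray) -> np.ndarray:
    """Simple sensor fusion: concatenate per-channel mean and std."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

def build_dataset(recordings, labels):
    """recordings: list of (WINDOW, ACC_CH + PRESSURE_CH + FLEX_CH) arrays."""
    X = np.stack([extract_features(r) for r in recordings])
    return X, np.asarray(labels)

if __name__ == "__main__":
    # Synthetic stand-in data; real data would come from the glove sensors.
    rng = np.random.default_rng(0)
    n_channels = ACC_CH + PRESSURE_CH + FLEX_CH
    recordings = [rng.normal(size=(WINDOW, n_channels)) for _ in range(200)]
    labels = rng.integers(0, 4, size=200)   # e.g., 4 grasp orientation classes

    X, y = build_dataset(recordings, labels)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
    print("held-out accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```

A real pipeline would segment the demonstrator's glove streams into labeled grasp windows; the synthetic arrays here only stand in for that step.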
Pages: 17