Human-Robot Interaction Using Learning from Demonstrations and a Wearable Glove with Multiple Sensors

Cited: 3
Authors
Singh, Rajmeet [1 ]
Mozaffari, Saeed [1 ]
Akhshik, Masoud [1 ]
Ahamed, Mohammed Jalal [1 ]
Rondeau-Gagne, Simon [2 ]
Alirezaee, Shahpour [1 ]
Affiliations
[1] Univ Windsor, Mech Automot & Mat Engn Dept, Windsor, ON N9B 3P4, Canada
[2] Univ Windsor, Dept Chem & Biochem, Windsor, ON N9B 3P4, Canada
Keywords
robotic grasping; human-robot interaction; inertia; pressure; flexi sensors; wearable devices; learning from demonstration;
DOI
10.3390/s23249780
CLC Classification Number
O65 [Analytical Chemistry]
Discipline Classification Codes
070302; 081704
Abstract
Human-robot interaction is of the utmost importance as it enables seamless collaboration and communication between humans and robots, leading to enhanced productivity and efficiency. It involves gathering data from humans, transmitting the data to a robot for execution, and providing feedback to the human. Complex tasks such as robotic grasping and manipulation, which require both human intelligence and robotic capabilities, demand effective interaction modes. To address this issue, we use a wearable glove to collect relevant data from a human demonstrator for improved human-robot interaction. Accelerometer, pressure, and flexi sensors were embedded in the wearable glove to measure motion and force information for handling objects of different sizes, materials, and conditions. A machine learning algorithm based on a multi-sensor fusion method is proposed to recognize grasp orientation and position.
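The abstract describes fusing accelerometer, pressure, and flex-sensor readings into a single feature representation that a machine learning model classifies. The paper's actual feature dimensions and classifier are not given in this record, so the following is only an illustrative sketch: it assumes 3 accelerometer axes, 5 fingertip pressure channels, and 5 flex channels per glove sample, concatenates them (a simple feature-level fusion), and uses a nearest-centroid rule as a hypothetical stand-in for the unspecified classifier.

```python
import numpy as np

def fuse_features(accel, pressure, flex):
    """Feature-level fusion: concatenate the three sensor modalities
    (3 accel axes + 5 pressure channels + 5 flex channels, assumed sizes)
    into one 13-dimensional feature vector."""
    return np.concatenate([np.asarray(accel, float),
                           np.asarray(pressure, float),
                           np.asarray(flex, float)])

class NearestCentroidGrasp:
    """Minimal stand-in classifier (NOT the paper's algorithm): assigns a
    sample to the grasp class whose mean training vector is closest."""
    def fit(self, X, y):
        X, y = np.asarray(X), np.asarray(y)
        self.labels_ = sorted(set(y))
        # One centroid per grasp class, averaged over its training samples.
        self.centroids_ = np.stack([X[y == c].mean(axis=0)
                                    for c in self.labels_])
        return self

    def predict(self, X):
        # Euclidean distance from each sample to each class centroid.
        d = np.linalg.norm(np.asarray(X)[:, None, :] - self.centroids_[None],
                           axis=2)
        return [self.labels_[i] for i in d.argmin(axis=1)]

# Toy demonstration data (hypothetical grasp classes and sensor values).
X_train = [fuse_features([0, 0, 1], [0.10] * 5, [10] * 5),
           fuse_features([0, 0, 1], [0.20] * 5, [12] * 5),
           fuse_features([1, 0, 0], [0.80] * 5, [60] * 5),
           fuse_features([1, 0, 0], [0.90] * 5, [65] * 5)]
y_train = ["pinch", "pinch", "power", "power"]
clf = NearestCentroidGrasp().fit(X_train, y_train)
```

A query sample near the "pinch" training data, e.g. `clf.predict([fuse_features([0, 0, 1], [0.15]*5, [11]*5)])`, is assigned the "pinch" label. In practice one would swap the nearest-centroid rule for a trained model and add per-channel normalization, since pressure and flex readings live on very different scales.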
Pages: 17
Related Papers
50 records in total
  • [31] BioMot Exoskeleton - Towards a Smart Wearable Robot for Symbiotic Human-Robot Interaction
    Bacek, Tomislav
    Moltedo, Marta
    Langlois, Kevin
    Asin Prieto, Guillermo
    Carmen Sanchez-Villamanan, Maria
    Gonzalez-Vargas, Jose
    Vanderborght, Bram
    Lefeber, Dirk
    Moreno, Juan C.
    2017 INTERNATIONAL CONFERENCE ON REHABILITATION ROBOTICS (ICORR), 2017, : 1666 - 1671
  • [32] Signs of Language: Embodied Sign Language Fingerspelling Acquisition from Demonstrations for Human-Robot Interaction
    Tavella, Federico
    Galata, Aphrodite
    Cangelosi, Angelo
    2023 32ND IEEE INTERNATIONAL CONFERENCE ON ROBOT AND HUMAN INTERACTIVE COMMUNICATION, RO-MAN, 2023, : 1137 - 1143
  • [33] Adaptive technique for physical human-robot interaction handling using proprioceptive sensors
    Popov, Dmitry
    Pashkevich, Anatol
    Klimchik, Alexandr
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2023, 126
  • [34] Wearable Robot Design Optimization Using Closed-Form Human-Robot Dynamic Interaction Model
    Shahabpoor, Erfan
    Gray, Bethany
    Plummer, Andrew
    SENSORS, 2024, 24 (13)
  • [35] Learning representations for robust human-robot interaction
    Kuo, Yen-Ling
    AI MAGAZINE, 2024, 45 (04) : 561 - 568
  • [36] Emotionally Assisted Human-Robot Interaction Using a Wearable Device for Reading Facial Expressions
    Gruebler, Anna
    Berenz, Vincent
    Suzuki, Kenji
    ADVANCED ROBOTICS, 2012, 26 (10) : 1143 - 1159
  • [37] Learning Representations for Robust Human-Robot Interaction
    Kuo, Yen-Ling
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 20, 2024, : 22673 - 22673
  • [38] Learning and Comfort in Human-Robot Interaction: A Review
    Wang, Weitian
    Chen, Yi
    Li, Rui
    Jia, Yunyi
    APPLIED SCIENCES-BASEL, 2019, 9 (23):
  • [39] Incremental learning of gestures for human-robot interaction
    Okada, Shogo
    Kobayashi, Yoichi
    Ishibashi, Satoshi
    Nishida, Toyoaki
    AI & SOCIETY, 2010, 25 (02) : 155 - 168
  • [40] Learning Human-Arm Reaching Motion Using a Wearable Device in Human-Robot Collaboration
    Kahanowich, Nadav D.
    Sintov, Avishai
    IEEE ACCESS, 2024, 12 : 24855 - 24865