Determining Movement Measures for Trust Assessment in Human-Robot Collaboration Using IMU-Based Motion Tracking

Cited by: 1
Authors
Hald, Kasper [1 ]
Rehm, Matthias [1 ]
Affiliations
[1] Aalborg Univ, Dept Architecture Design & Media Technol, Rendsburggade 14, DK-9000 Aalborg, Denmark
Funding
EU Horizon 2020;
Keywords
DOI
10.1109/RO-MAN57019.2023.10309497
Chinese Library Classification
TP18 [Theory of artificial intelligence];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Close-proximity human-robot collaboration (HRC) requires an appropriate level of trust from the operator toward the robot to maintain safety and efficiency. Maintaining an appropriate trust level during robot-aided production requires non-obstructive, real-time human-robot trust assessment. To this end, we performed an experiment with 20 participants performing two types of HRC tasks in close proximity to a Kuka KR 300 R2500 ultra robot. The two tasks involved collaborative transport of textiles and collaborative draping, respectively. During the experiment we performed full-body motion tracking and administered human-robot trust questionnaires in order to investigate the correlation between trust and operator movement patterns. From the initial per-session analyses we see effects of task type on movement patterns, but the correlations with trust are weak overall. Further analysis at higher temporal resolution and with correction for participants' baseline movement patterns is required.
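The per-session analysis the abstract describes boils down to correlating aggregated movement measures with questionnaire trust scores across sessions. The following is a minimal illustrative sketch of such an analysis; the measure name (`mean_hand_speed`), the data values, and the questionnaire scale are assumptions for demonstration, not the authors' actual dataset or pipeline.

```python
# Hypothetical sketch of a per-session correlation analysis between a
# movement measure and questionnaire trust scores. All values below are
# illustrative assumptions, not data from the paper.
from statistics import mean, stdev


def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
    return cov / (stdev(xs) * stdev(ys))


# Illustrative per-session values for five hypothetical participants:
mean_hand_speed = [0.42, 0.55, 0.31, 0.47, 0.60]  # m/s (assumed IMU-derived measure)
trust_score = [5.1, 4.2, 5.8, 4.9, 3.9]           # assumed 7-point questionnaire mean

r = pearson_r(mean_hand_speed, trust_score)
print(f"r = {r:.2f}")
```

With real data, such a per-session coefficient would then be tested for significance and, as the abstract notes, corrected for each participant's baseline movement patterns before drawing conclusions.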
Pages: 1267-1272
Page count: 6