Human intention recognition method based on context awareness and graph attention network for human-robot collaborative assembly

Cited by: 0
Authors
Yao D. [1 ,3 ]
Xu W. [1 ,3 ]
Yao B. [2 ,3 ]
Liu J. [1 ,3 ]
Ji Z. [1 ,3 ]
Affiliations
[1] School of Information Engineering, Wuhan University of Technology, Wuhan
[2] School of Mechanical and Electronic Engineering, Wuhan University of Technology, Wuhan
[3] Hubei Key Laboratory of Broadband Wireless Communication and Sensor Networks, Wuhan
Keywords
context-aware; graph attention network; human intention recognition; human-robot collaborative assembly
DOI
10.13196/j.cims.2021.0781
Abstract
Existing research has not fully exploited the three-dimensional spatial and visual features of the various elements in complex assembly environments, nor has it considered the rich contextual information these environments provide, which limits the accuracy of intention recognition. To address this problem, a method was proposed that combined the three-dimensional spatial and visual information of the elements in the assembly environment and used a graph attention network to achieve high-precision recognition of the human worker's operation intention. Faster R-CNN was used to detect the elements in the assembly scene, such as human workers and robots, to obtain the spatial information of each element and to extract its visual features from the network. A graph attention network was then used to reason about the human worker's interaction intention toward different parts during assembly, such as handling, assembling and dragging. A gear assembly case study was conducted to verify the proposed method. The experimental results showed that the proposed method achieved higher recognition accuracy and better scene generalization than a deep convolutional neural network. © 2024 CIMS. All rights reserved.
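A minimal sketch of the reasoning step described in the abstract, not the authors' implementation: each detected scene element is assumed to be a graph node whose feature concatenates a visual descriptor with its 3D position, and a small graph attention network scores the worker's intention (e.g. handling / assembling / dragging). The class names, feature dimensions, fully connected adjacency, and the convention that node 0 is the worker are all illustrative assumptions.

```python
# Hypothetical sketch of GAT-based intention reasoning over detected scene elements.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GATLayer(nn.Module):
    """Single-head graph attention layer in the style of Velickovic et al."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)   # shared node projection
        self.a = nn.Linear(2 * out_dim, 1, bias=False)    # attention scoring
        self.leaky_relu = nn.LeakyReLU(0.2)

    def forward(self, h, adj):
        # h: (N, in_dim) node features; adj: (N, N) adjacency with self-loops
        z = self.W(h)
        N = z.size(0)
        zi = z.unsqueeze(1).expand(N, N, -1)              # row node i
        zj = z.unsqueeze(0).expand(N, N, -1)              # column node j
        e = self.leaky_relu(self.a(torch.cat([zi, zj], dim=-1))).squeeze(-1)
        e = e.masked_fill(adj == 0, float("-inf"))        # attend to neighbours only
        alpha = F.softmax(e, dim=-1)                      # (N, N) attention weights
        return F.elu(alpha @ z)                           # aggregated node features


class IntentionGAT(nn.Module):
    """Two GAT layers, then an intention classifier on the worker node."""

    def __init__(self, feat_dim=2048 + 3, hidden=256, num_intentions=3):
        super().__init__()
        self.gat1 = GATLayer(feat_dim, hidden)
        self.gat2 = GATLayer(hidden, hidden)
        self.cls = nn.Linear(hidden, num_intentions)

    def forward(self, node_feats, adj, worker_idx=0):
        h = self.gat1(node_feats, adj)
        h = self.gat2(h, adj)
        return self.cls(h[worker_idx])                    # logits over intentions


# Toy usage: 5 detected elements (worker, robot, 3 parts), fully connected graph.
feats = torch.randn(5, 2048 + 3)      # assumed visual feature + 3D position per node
adj = torch.ones(5, 5)                # assumed scene graph with self-loops
logits = IntentionGAT()(feats, adj)   # scores for handling / assembling / dragging
```

In this sketch the node features would come from a detector such as Faster R-CNN (boxes give the spatial part, backbone activations the visual part), which is the division of labour the abstract describes; the graph construction itself is an assumption here.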
Pages: 2005-2013
Page count: 8