Machine Motion Trajectory Detection Based on Siamese Graph-Attention Adaptive Network

Cited by: 0
Authors
Yu, Jianbo [1 ]
Huang, Yihao [1 ]
Gao, Yanfeng [2 ]
Li, Qingfeng [3 ]
Affiliations
[1] Tongji Univ, Sch Mech Engn, Shanghai 201804, Peoples R China
[2] Shanghai Univ Engn Sci, Shanghai Large Component Intelligent Mfg Robot Tec, Shanghai 201620, Peoples R China
[3] Beihang Univ, Hangzhou Innovat Inst, Res Ctr Big Data & Computat Intelligence, Hangzhou 100191, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Anchor-free mechanism; graph attention; machine motion; Siamese network; trajectory detection; TRACKING;
DOI
10.1109/TIM.2023.3290311
CLC Number
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology];
Discipline Code
0808 ; 0809 ;
Abstract
There are many demands for motion trajectory detection of machines in intelligent manufacturing. Existing detection methods may produce errors or lose the target motion trajectory. An anti-interference machine motion trajectory detection method based on the Siamese graph-attention adaptive network (SiamGAAN) is proposed in this article. A graph attention-based feature matching method is proposed in the feature extraction network of SiamGAAN, which transfers the template feature information to the search feature instead of relying on the traditional cross-correlation method. The regression network, based on an adaptive anchor-free mechanism, directly predicts the distances between the sampling points and the boundaries of the target region. A quality determination-based index is proposed in the classification network to improve the accuracy of the target box. SiamGAAN can continuously detect the motion and collect the tracking information of machines in the complex environment of a workshop. SiamGAAN achieves a 1.5% improvement in accuracy and a 1.9% improvement in tracking success rate over typical methods on the benchmark dataset OTB2015. A tracking success rate of 90.58% is achieved on a robotic arm motion trajectory that simulates real workshop operation.
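The abstract describes two architectural ideas that a short sketch may help make concrete: graph attention-based feature matching that transfers template information to the search feature (in place of cross-correlation), and an anchor-free head that regresses, at each sampling point, the distances to the four boundaries of the target box alongside a classification score and a box-quality score. The PyTorch sketch below is a minimal illustration under assumed channel sizes, a single-head attention form, and simplified heads; it is not the paper's exact SiamGAAN architecture.

```python
# Minimal, assumption-laden sketch of graph-attention feature matching and an
# anchor-free prediction head for Siamese tracking. Module names, channel sizes,
# and the single-head attention are illustrative, not the published SiamGAAN design.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GraphAttentionMatch(nn.Module):
    """Transfer template information to the search feature via attention, treating
    every spatial position of each feature map as a graph node."""

    def __init__(self, channels: int = 256):
        super().__init__()
        self.query = nn.Conv2d(channels, channels, kernel_size=1)   # search nodes
        self.key = nn.Conv2d(channels, channels, kernel_size=1)     # template nodes
        self.value = nn.Conv2d(channels, channels, kernel_size=1)
        self.fuse = nn.Conv2d(2 * channels, channels, kernel_size=1)

    def forward(self, template: torch.Tensor, search: torch.Tensor) -> torch.Tensor:
        b, c, hs, ws = search.shape
        q = self.query(search).flatten(2).transpose(1, 2)        # (B, Ns, C)
        k = self.key(template).flatten(2)                        # (B, C, Nt)
        v = self.value(template).flatten(2).transpose(1, 2)      # (B, Nt, C)
        attn = F.softmax(q @ k / c ** 0.5, dim=-1)               # (B, Ns, Nt) edge weights
        msg = (attn @ v).transpose(1, 2).reshape(b, c, hs, ws)   # aggregated template info
        return self.fuse(torch.cat([search, msg], dim=1))        # fused search feature


class AnchorFreeHead(nn.Module):
    """Per-location classification, box-quality score, and distances (l, t, r, b)
    from each sampling point to the target box boundaries."""

    def __init__(self, channels: int = 256):
        super().__init__()
        self.cls = nn.Conv2d(channels, 2, kernel_size=3, padding=1)      # target / background
        self.quality = nn.Conv2d(channels, 1, kernel_size=3, padding=1)  # quality index
        self.reg = nn.Conv2d(channels, 4, kernel_size=3, padding=1)      # l, t, r, b

    def forward(self, x: torch.Tensor):
        return self.cls(x), torch.sigmoid(self.quality(x)), F.relu(self.reg(x))


if __name__ == "__main__":
    match, head = GraphAttentionMatch(256), AnchorFreeHead(256)
    z = torch.randn(1, 256, 8, 8)     # template-branch feature map
    x = torch.randn(1, 256, 32, 32)   # search-branch feature map
    cls, quality, ltrb = head(match(z, x))
    print(cls.shape, quality.shape, ltrb.shape)  # (1,2,32,32) (1,1,32,32) (1,4,32,32)
```

In this sketch the final box at a location would be scored by combining the classification and quality outputs, mirroring the abstract's quality determination-based index for selecting an accurate target box.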
Pages: 10
Related Papers
50 records in total
  • [31] Graph-Based Siamese Network for Authorship Verification
    Embarcadero-Ruiz, Daniel
    Gomez-Adorno, Helena
    Embarcadero-Ruiz, Alberto
    Sierra, Gerardo
    MATHEMATICS, 2022, 10 (02)
  • [32] Multi-Order-Content-Based Adaptive Graph Attention Network for Graph Node Classification
    Chen, Yong
    Xie, Xiao-Zhu
    Weng, Wei
    He, Yi-Fan
    SYMMETRY-BASEL, 2023, 15 (05)
  • [33] ACGVD: Vulnerability Detection Based on Comprehensive Graph via Graph Neural Network with Attention
    Li, Min
    Li, Chunfang
    Li, Shuailou
    Wu, Yanna
    Zhang, Boyang
    Wen, Yu
    INFORMATION AND COMMUNICATIONS SECURITY (ICICS 2021), PT I, 2021, 12918 : 243 - 259
  • [34] EEG-based Auditory Attention Detection with Spatiotemporal Graph and Graph Convolutional Network
    Wang, Ruicong
    Cai, Siqi
    Li, Haizhou
    INTERSPEECH 2023, 2023: 1144 - 1148
  • [35] EGAT: Extended Graph Attention Network for Pedestrian Trajectory Prediction
    Kong, Wei
    Liu, Yun
    Li, Hui
    Wang, Chuanxu
    COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE, 2021, 2021
  • [36] SGAMTE-Net: A pedestrian trajectory prediction network based on spatiotemporal graph attention and multimodal trajectory endpoints
    Xin, Yang
    Liao, Bingxian
    Wang, Xiangcheng
    APPLIED INTELLIGENCE, 2023, 53 (24) : 31165 - 31180
  • [37] PTPGC: Pedestrian trajectory prediction by graph attention network with ConvLSTM
    Yang, Juan
    Sun, Xu
    Wang, Rong Gui
    Xue, Li Xia
    ROBOTICS AND AUTONOMOUS SYSTEMS, 2022, 148
  • [38] Sparse Attention Graph Convolution Network for Vehicle Trajectory Prediction
    Chen, Chongpu
    Chen, Xinbo
    Yang, Yi
    Hang, Peng
    IEEE TRANSACTIONS ON VEHICULAR TECHNOLOGY, 2024, 73 (12) : 18294 - 18306