High-Performance Transformer Tracking

Cited by: 17
Authors
Chen, Xin [1 ,2 ]
Yan, Bin [1 ,2 ]
Zhu, Jiawen [1 ,2 ]
Lu, Huchuan [1 ,2 ]
Ruan, Xiang [3 ]
Wang, Dong [1 ,2 ]
Affiliations
[1] Dalian Univ Technol, Sch Informat & Commun Engn, Dalian 116024, Liaoning, Peoples R China
[2] Dalian Univ Technol, Ningbo Inst, Ningbo 315016, Zhejiang, Peoples R China
[3] Tiwaki Co Ltd, Kusatsu, Shiga 5258577, Japan
Funding
National Natural Science Foundation of China;
Keywords
Transformers; Target tracking; Correlation; Magnetic heads; Feature extraction; Semantics; Head; Cross-attention; object tracking; self-attention; siamese tracking; transformer; VISUAL TRACKING;
DOI
10.1109/TPAMI.2022.3232535
CLC classification number
TP18 [Theory of Artificial Intelligence];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Correlation plays a critical role in the tracking field, especially in recent popular Siamese-based trackers. The correlation operation is a simple fusion method that considers the similarity between the template and the search region. However, the correlation operation is a local linear matching process that loses semantic information and easily falls into a local optimum, which may be the bottleneck in designing high-accuracy tracking algorithms. In this work, to determine whether a better feature fusion method exists than correlation, a novel attention-based feature fusion network, inspired by the transformer, is presented. This network effectively combines the template and search region features using an attention mechanism. Specifically, the proposed method includes an ego-context augment module based on self-attention and a cross-feature augment module based on cross-attention. First, we present a transformer tracking (named TransT) method based on the Siamese-like feature extraction backbone, the designed attention-based fusion mechanism, and the classification and regression heads. Based on the TransT baseline, we also design a segmentation branch to generate the accurate mask. Finally, we propose a stronger version of TransT by extending it with a multi-template scheme and an IoU prediction head, named TransT-M. Experiments show that our TransT and TransT-M methods achieve promising results on seven popular benchmarks. Code and models are available at https://github.com/chenxin-dlut/TransT-M.
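The fusion design described in the abstract, a self-attention ego-context augment (ECA) followed by a cross-attention cross-feature augment (CFA) between template and search-region features, can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the token counts, feature dimension, and the single-head form without positional encodings, multi-head projections, or feed-forward layers are all simplifying assumptions made here for illustration.

```python
import numpy as np

def attention(q, k, v):
    """Scaled dot-product attention: each row of q attends over the rows of k/v."""
    scores = q @ k.T / np.sqrt(q.shape[-1])
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ v

rng = np.random.default_rng(0)
C = 32                                   # illustrative feature dimension
z = rng.standard_normal((64, C))         # template tokens (assumed count)
x = rng.standard_normal((256, C))        # search-region tokens (assumed count)

# ECA: self-attention within each feature set, with a residual connection
z = z + attention(z, z, z)
x = x + attention(x, x, x)

# CFA: cross-attention fuses the two branches -- the search region queries
# the template (and vice versa), instead of a local linear correlation
x_fused = x + attention(x, z, z)
z_fused = z + attention(z, x, x)

print(x_fused.shape, z_fused.shape)
```

Unlike a depth-wise correlation, every search token here aggregates information from all template tokens with learned-style softmax weights, which is the abstract's argument for replacing correlation with attention-based fusion.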
Pages: 8507-8523
Page count: 17