Learning Attentions: Residual Attentional Siamese Network for High Performance Online Visual Tracking

Cited by: 473
Authors
Wang, Qiang
Teng, Zhu
Xing, Junliang
Gao, Jin
Hu, Weiming
Maybank, Stephen
Institution
Source
2018 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR) | 2018
Keywords
DOI
10.1109/CVPR.2018.00510
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Code
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Offline training for object tracking has recently shown great potential in balancing tracking accuracy and speed. However, it is still difficult to adapt an offline-trained model to a target tracked online. This work presents a Residual Attentional Siamese Network (RASNet) for high-performance object tracking. The RASNet model reformulates the correlation filter within a Siamese tracking framework and introduces several kinds of attention mechanisms to adapt the model without online updating. In particular, by exploiting the offline-trained general attention, the target-adapted residual attention, and the channel-favored feature attention, RASNet not only mitigates the over-fitting problem in deep network training, but also enhances its discriminative capacity and adaptability through the separation of representation learning and discriminator learning. The proposed deep architecture is trained end to end and takes full advantage of rich spatio-temporal information to achieve robust visual tracking. Experimental results on two recent benchmarks, OTB-2015 and VOT2017, show that the RASNet tracker achieves state-of-the-art tracking accuracy while running at more than 80 frames per second.
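As a rough illustration of the architecture sketched in the abstract, the PyTorch-style code below shows one way an attention-weighted Siamese cross-correlation could be assembled: a learned general spatial prior plus a residual predicted from the template form the spatial attention, a squeeze-style branch supplies channel attention, and the weighted template features are correlated with the search-region features to produce a response map. This is a minimal sketch; the class name AttentionSiamese, the toy backbone, and all layer sizes are assumptions for illustration and are not taken from the paper.

import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionSiamese(nn.Module):
    # Toy attention-weighted Siamese correlation; illustrative only.
    def __init__(self, feat_channels=256, prior_size=6):
        super().__init__()
        # Placeholder shared backbone (RASNet trains a deeper network fully offline).
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 96, kernel_size=11, stride=4), nn.ReLU(),
            nn.Conv2d(96, feat_channels, kernel_size=5, stride=2), nn.ReLU(),
        )
        # General attention: a learned spatial prior shared across all targets.
        self.general_attention = nn.Parameter(torch.zeros(1, 1, prior_size, prior_size))
        # Residual attention: predicted from the template features, adapting the
        # prior to the specific target without any online model update.
        self.residual_attention = nn.Conv2d(feat_channels, 1, kernel_size=1)
        # Channel attention: per-channel weights from globally pooled template features.
        self.channel_attention = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(feat_channels, feat_channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, template, search):
        z = self.backbone(template)   # (B, C, h, w) template features
        x = self.backbone(search)     # (B, C, H, W) search-region features
        # Spatial attention = resized general prior + target-adapted residual.
        spatial = torch.sigmoid(
            F.interpolate(self.general_attention, size=z.shape[-2:], mode="bilinear",
                          align_corners=False)
            + self.residual_attention(z)
        )
        # The attention-weighted template acts as the correlation kernel.
        z = z * spatial * self.channel_attention(z)
        # Grouped convolution correlates each weighted template with its own
        # search features; the peak of the response map locates the target.
        b, c, height, width = x.shape
        response = F.conv2d(x.reshape(1, b * c, height, width), z, groups=b)
        return response.reshape(b, 1, *response.shape[-2:])

In a typical Siamese tracker of this kind, the template branch is evaluated once on the first frame, so each subsequent frame requires only a backbone pass over the search region plus the correlation, which is consistent with the real-time speeds reported in the abstract.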
Pages: 4854-4863
Page count: 10
Related Papers
50 records in total
  • [1] Siamese attentional keypoint network for high performance visual tracking
    Gao, Peng
    Yuan, Ruyue
    Wang, Fei
    Xiao, Liyi
    Fujita, Hamido
    Zhang, Yan
    KNOWLEDGE-BASED SYSTEMS, 2020, 193
  • [2] SiamRAAN: Siamese Residual Attentional Aggregation Network for Visual Object Tracking
    Xin, Zhiyi
    Yu, Junyang
    He, Xin
    Song, Yalin
    Li, Han
    NEURAL PROCESSING LETTERS, 2024, 56
  • [3] SiamRAAN: Siamese Residual Attentional Aggregation Network for Visual Object Tracking
    Xin, Zhiyi
    Yu, Junyang
    He, Xin
    Song, Yalin
    Li, Han
    NEURAL PROCESSING LETTERS, 2024, 56 (02)
  • [4] Dual Attentional Siamese Network for Visual Tracking
    Zhang Xiaowei
    Ma Jianwei
    Liu Hong
    Hu Hai-Miao
    Yang Peng
    DISPLAYS, 2022, 74
  • [5] Multi-branch Siamese Network for High Performance Online Visual Tracking
    Zhuang, Junfei
    Dong, Yuan
    Bai, Hongliang
    Wang, Gang
    ASIAN CONFERENCE ON MACHINE LEARNING, VOL 101, 2019, 101 : 519 - 534
  • [6] SiamATL: Online Update of Siamese Tracking Network via Attentional Transfer Learning
    Huang, Bo
    Xu, Tingfa
    Shen, Ziyi
    Jiang, Shenwang
    Zhao, Bingqing
    Bian, Ziyang
    IEEE TRANSACTIONS ON CYBERNETICS, 2022, 52 (08) : 7527 - 7540
  • [7] Siamese residual network for efficient visual tracking
    Fan, Nana
    Liu, Qiao
    Li, Xin
    Zhou, Zikun
    He, Zhenyu
    INFORMATION SCIENCES, 2023, 624 : 606 - 623
  • [8] Siamese Visual Tracking With Residual Fusion Learning
    Sun, Xinglong
    Han, Guangliang
    Guo, Lihong
    IEEE ACCESS, 2022, 10 : 88421 - 88433
  • [9] Online Siamese Network for Visual Object Tracking
    Chang, Shuo
    Li, Wei
    Zhang, Yifan
    Feng, Zhiyong
    SENSORS, 2019, 19 (08)
  • [10] Siamese Attentional Cascade Keypoints Network for Visual Object Tracking
    Wang, Ershen
    Wang, Donglei
    Huang, Yufeng
    Tong, Gang
    Xu, Song
    Pang, Tao
    IEEE ACCESS, 2021, 9 : 7243 - 7254