Attribute saliency network for person re-identification

Cited: 3
Authors
Tay, Chiat-Pin [1]
Yap, Kim-Hui [1]
Affiliations
[1] Nanyang Technol Univ, Singapore, Singapore
Keywords
Person re-identification; Person attribute; Attention or saliency map; Attribute learning; Recognition
DOI
10.1016/j.imavis.2021.104298
CLC Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
This paper proposes the Attribute Saliency Network (ASNet), a deep learning model that utilizes attribute and saliency map learning for the person re-identification (re-ID) task. Many re-ID methods use human pose or local body parts, at either fixed or automatically learned positions, to guide the learning. Person attributes, although they can describe a person in greater detail, are seldom used in retrieving a person's images. We therefore propose to integrate person attribute learning into the re-ID model and let it learn jointly with the person identity network. This arrangement produces a synergistic effect, and better representations are thus encoded. In addition, both visual and textual retrieval, such as querying by clothing color, hair length, etc., become possible. We also propose to improve the granularity of the heatmap by generating two types of saliency maps, global person attribute and body part saliency maps, to capture fine-grained details of the person and thus enhance the discriminative power of the encoded vectors. As a result, we achieve state-of-the-art performance. On the Market-1501 dataset, we achieve 90.5% mAP and 96.3% Rank-1 accuracy. On DukeMTMC-reID, we obtain 82.7% mAP and 90.6% Rank-1 accuracy. (c) 2021 Elsevier B.V. All rights reserved.
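To make the joint attribute-and-identity scheme described in the abstract more concrete, the sketch below is a minimal PyTorch illustration of a shared backbone whose attribute branch produces per-attribute saliency maps, which are then fused into a spatial attention map used to pool the identity embedding. All names (TinyBackbone, AttributeSaliencySketch), layer sizes, the attribute count, and the fusion and pooling choices are illustrative assumptions for exposition, not the authors' actual ASNet architecture.

```python
# Minimal sketch of joint attribute + identity learning with attribute-driven
# saliency maps, in the spirit of the abstract above. Everything here
# (module names, dimensions, tiny backbone) is an illustrative assumption,
# not the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TinyBackbone(nn.Module):
    """Stand-in CNN backbone (a real re-ID model would use a stronger network)."""
    def __init__(self, out_channels=256):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 64, 3, stride=2, padding=1), nn.BatchNorm2d(64), nn.ReLU(inplace=True),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.BatchNorm2d(128), nn.ReLU(inplace=True),
            nn.Conv2d(128, out_channels, 3, stride=2, padding=1), nn.BatchNorm2d(out_channels), nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.features(x)  # (B, C, H, W) feature map


class AttributeSaliencySketch(nn.Module):
    """Shared backbone with an attribute branch whose per-attribute saliency
    maps re-weight the feature map used by the identity branch."""
    def __init__(self, num_ids=751, num_attrs=27, channels=256):
        super().__init__()
        self.backbone = TinyBackbone(channels)
        # 1x1 conv produces one saliency map per attribute.
        self.attr_saliency = nn.Conv2d(channels, num_attrs, kernel_size=1)
        self.attr_head = nn.Linear(num_attrs, num_attrs)   # attribute logits
        self.id_head = nn.Linear(channels, num_ids)        # identity logits

    def forward(self, images):
        fmap = self.backbone(images)                        # (B, C, H, W)
        saliency = torch.sigmoid(self.attr_saliency(fmap))  # (B, A, H, W)
        # Attribute predictions from spatially pooled saliency responses.
        attr_logits = self.attr_head(saliency.mean(dim=(2, 3)))
        # Fuse the per-attribute saliency maps into a single spatial attention
        # map and use it to pool a more discriminative identity embedding.
        attention = saliency.mean(dim=1, keepdim=True)      # (B, 1, H, W)
        weighted = fmap * attention
        embedding = F.adaptive_avg_pool2d(weighted, 1).flatten(1)  # (B, C)
        id_logits = self.id_head(embedding)
        return id_logits, attr_logits, embedding


if __name__ == "__main__":
    model = AttributeSaliencySketch()
    imgs = torch.randn(4, 3, 256, 128)            # typical re-ID crop size
    ids = torch.randint(0, 751, (4,))
    attrs = torch.randint(0, 2, (4, 27)).float()  # binary attribute labels
    id_logits, attr_logits, _ = model(imgs)
    # Joint objective: identity cross-entropy + attribute binary cross-entropy,
    # reflecting the idea of letting the two tasks reinforce each other.
    loss = F.cross_entropy(id_logits, ids) + \
           F.binary_cross_entropy_with_logits(attr_logits, attrs)
    loss.backward()
    print(float(loss))
```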
Pages: 12
Related Papers
50 records in total
  • [21] Improving person re-identification by attribute and identity learning
    Lin, Yutian
    Zheng, Liang
    Zheng, Zhedong
    Wu, Yu
    Hu, Zhilan
    Yan, Chenggang
    Yang, Yi
    PATTERN RECOGNITION, 2019, 95 : 151 - 161
  • [22] Weak saliency ensemble network for person Re-identification using infrared light images
    Jeong, Min Su
    Jeong, Seong In
    Lee, Dong Chan
    Jung, Seung Yong
    Park, Kang Ryoung
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2025, 139
  • [23] S2-Net: Semantic and Saliency Attention Network for Person Re-Identification
    Ren, Xuena
    Zhang, Dongming
    Bao, Xiuguo
    Zhang, Yongdong
    IEEE TRANSACTIONS ON MULTIMEDIA, 2023, 25 : 4387 - 4399
  • [24] Person re-identification network based on weight-driven saliency hierarchical utilization
    Yan, Pu
    Tang, Qingwei
    Chen, Jie
    Wang, Gang
    Fang, Yue
    JOURNAL OF ELECTRONIC IMAGING, 2022, 31 (03)
  • [25] Grafted network for person re-identification
    Wang, Jiabao
    Li, Yang
    Jiao, Shanshan
    Miao, Zhuang
    Zhang, Rui
    SIGNAL PROCESSING-IMAGE COMMUNICATION, 2020, 80
  • [26] Person Re-Identification by Siamese Network
    Shebiah, R. Newlin
    Arivazhagan, S.
    Amrith, S. G.
    Adarsh, S.
    INTELIGENCIA ARTIFICIAL-IBEROAMERICAN JOURNAL OF ARTIFICIAL INTELLIGENCE, 2023, 26 (71) : 25 - 33
  • [27] Relation Network for Person Re-Identification
    Park, Hyunjong
    Ham, Bumsub
    THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 11839 - 11847
  • [28] Doppelganger Saliency: Towards More Ethical Person Re-Identification
    RichardWebster, Brandon
    Hu, Brian
    Fieldhouse, Keith
    Hoogs, Anthony
    2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION WORKSHOPS, CVPRW 2022, 2022, : 2846 - 2856
  • [29] Saliency-Based Person Re-identification by Probability Histogram
    Zhang, Zongyan
    Zhao, Cairong
    Miao, Duoqian
    Wang, Xuekuan
    Lai, Zhihui
    Yang, Jian
    COMPUTER VISION - ACCV 2016 WORKSHOPS, PT III, 2017, 10118 : 315 - 329
  • [30] Attribute Memory Transfer Network for Unsupervised Cross-Domain Person Re-Identification
    Zheng, Xiaochen
    Sun, Hongwei
    Tian, Xijiang
    Li, Ye
    He, Gewen
    Fan, Fangfang
    IEEE ACCESS, 2020, 8 : 186951 - 186962