TRIPLET DISTILLATION FOR DEEP FACE RECOGNITION

Cited: 0
Authors
Feng, Yushu [1 ]
Wang, Huan [1 ]
Hu, Haoji [1 ]
Yu, Lu [1 ]
Wang, Wei [2 ]
Wang, Shiyan [2 ]
Affiliations
[1] Zhejiang Univ, Coll Informat Sci & Elect Engn, Hangzhou, Peoples R China
[2] Chongqing Univ Posts & Telecommun, Chongqing, Peoples R China
Keywords
Face Recognition; Knowledge Distillation; Triplet Loss; Network Compression;
DOI
Not available
CLC Number
TB8 [Photographic Technology];
Discipline Code
0804;
Abstract
Convolutional neural networks (CNNs) have achieved great success in face recognition, which unfortunately comes at the cost of massive computation and storage. Many compact face recognition networks have therefore been proposed, and triplet loss is effective for further improving the performance of these compact models. However, it normally applies a fixed margin to all samples, which neglects the informative similarity structure between different identities. In this paper, we borrow the idea of knowledge distillation and treat this informative similarity as the transferred knowledge. We then propose an enhanced version of triplet loss, named triplet distillation, which exploits the capability of a teacher model to transfer similarity information to a student model by adaptively varying the margin between positive and negative pairs. Experiments on the LFW, AgeDB, and CPLFW datasets show the merits of our method over the original triplet loss.
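The adaptive-margin mechanism described in the abstract can be made concrete with a short sketch. Below is a minimal PyTorch illustration of the general idea, assuming `student` and `teacher` are modules that map image batches to embedding vectors; the gap-to-margin mapping, the bounds `m_min`/`m_max`, and the function name are illustrative assumptions rather than the paper's exact formulation.

```python
# Illustrative sketch of triplet distillation: the fixed triplet margin is
# replaced by a per-triplet margin derived from a teacher model's embedding
# distances. The gap-to-margin mapping below is an assumption, not the
# paper's exact formula.
import torch
import torch.nn.functional as F

def triplet_distillation_loss(student, teacher, anchor, positive, negative,
                              m_min=0.2, m_max=0.7):
    with torch.no_grad():  # the teacher is fixed; only the student learns
        ta = F.normalize(teacher(anchor), dim=1)
        tp = F.normalize(teacher(positive), dim=1)
        tn = F.normalize(teacher(negative), dim=1)
        # Teacher's distance gap: a large gap means the teacher finds the
        # two identities dissimilar, so the student gets a larger margin.
        gap = (ta - tn).norm(dim=1) - (ta - tp).norm(dim=1)
        # Distances of L2-normalized vectors lie in [0, 2], so clamping and
        # dividing by 2 maps the gap into [0, 1], then into [m_min, m_max].
        margin = m_min + (m_max - m_min) * gap.clamp(0.0, 2.0) / 2.0

    sa = F.normalize(student(anchor), dim=1)
    sp = F.normalize(student(positive), dim=1)
    sn = F.normalize(student(negative), dim=1)
    d_pos = (sa - sp).norm(dim=1)
    d_neg = (sa - sn).norm(dim=1)
    # Standard triplet hinge, but with the teacher-informed per-triplet margin.
    return F.relu(d_pos - d_neg + margin).mean()
```

The `torch.no_grad()` block keeps the teacher frozen so that gradients flow only through the student; triplets the teacher separates easily receive a larger margin, mirroring the abstract's idea of adaptively varying the margin between positive and negative pairs.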
Pages: 808-812
Number of pages: 5
Related Papers
50 records in total
  • [1] Grouped Knowledge Distillation for Deep Face Recognition
    Zhao, Weisong
    Zhu, Xiangyu
    Guo, Kaiwen
    Zhang, Xiao-Yu
    Lei, Zhen
    THIRTY-SEVENTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 37 NO 3, 2023, : 3615 - 3623
  • [2] AdaDistill: Adaptive Knowledge Distillation for Deep Face Recognition
    Boutros, Fadi
    Struc, Vitomir
    Damer, Naser
    COMPUTER VISION - ECCV 2024, PT LV, 2025, 15113 : 163 - 182
  • [3] Evaluation-oriented Knowledge Distillation for Deep Face Recognition
    Huang, Yuge
    Wu, Jiaxiang
    Xu, Xingkun
    Ding, Shouhong
    2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2022), 2022, : 18719 - 18728
  • [4] Light Deep Face Recognition based on Knowledge Distillation and Adversarial Training
    Liu, Jinjin
    Li, Xiaonan
    2022 INTERNATIONAL CONFERENCE ON MECHANICAL, AUTOMATION AND ELECTRICAL ENGINEERING, CMAEE, 2022, : 127 - 132
  • [5] Training Deep Face Recognition for Efficient Inference by Distillation and Mutual Learning
    Shen, Guodong
    Shen, Yao
Riaz, M. Naveed
    PROCEEDINGS OF THE 2018 IEEE INTERNATIONAL CONFERENCE ON PROGRESS IN INFORMATICS AND COMPUTING (PIC), 2018, : 38 - 43
  • [6] Enhanced Knowledge Distillation for Face Recognition
    Ni, Hao
    Shen, Jie
    Yuan, Chong
    2019 IEEE INTL CONF ON PARALLEL & DISTRIBUTED PROCESSING WITH APPLICATIONS, BIG DATA & CLOUD COMPUTING, SUSTAINABLE COMPUTING & COMMUNICATIONS, SOCIAL COMPUTING & NETWORKING (ISPA/BDCLOUD/SOCIALCOM/SUSTAINCOM 2019), 2019, : 1441 - 1444
  • [7] Deep Metric Learning with Triplet-Margin-Center Loss for Sketch Face Recognition
    Feng, Yujian
    Wu, Fei
    Ji, Yimu
    Jing, Xiao-Yuan
    Yu, Jian
IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, 2020, E103D (11): 2394 - 2397
  • [8] A single-stage face detection and face recognition deep neural network based on feature pyramid and triplet loss
    Tsai, Tsung-Han
    Chi, Po-Ting
    IET IMAGE PROCESSING, 2022, 16 (08) : 2148 - 2156
  • [9] CoupleFace: Relation Matters for Face Recognition Distillation
    Liu, Jiaheng
    Qin, Haoyu
    Wu, Yichao
    Guo, Jinyang
    Liang, Ding
    Xu, Ke
    COMPUTER VISION, ECCV 2022, PT XII, 2022, 13672 : 683 - 700