Robust Physical-World Attacks on Face Recognition

Cited by: 26
|
Authors
Zheng, Xin [1 ]
Fan, Yanbo [2 ]
Wu, Baoyuan [3 ]
Zhang, Yong [2 ]
Wang, Jue [2 ]
Pan, Shirui [4 ]
Affiliations
[1] Monash Univ, Melbourne, Vic, Australia
[2] Tencent AI Lab, Shenzhen, Peoples R China
[3] Chinese Univ Hong Kong, Shenzhen Res Inst Big Data, Sch Data Sci, Shenzhen, Peoples R China
[4] Griffith Univ, Sch Informat & Commun Technol, Gold Coast, Qld, Australia
Funding
National Natural Science Foundation of China;
Keywords
Physical-world adversarial attack; Face recognition; Environmental variations; Curriculum learning;
DOI
10.1016/j.patcog.2022.109009
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Face recognition has been greatly facilitated by the development of deep neural networks (DNNs) and has been widely applied to many safety-critical applications. However, recent studies have shown that DNNs are very vulnerable to adversarial examples, raising severe concerns on the security of real-world face recognition. In this work, we study sticker-based physical attacks on face recognition for better understanding its adversarial robustness. To this end, we first analyze in-depth the complicated physical-world conditions confronted by attacking face recognition, including the different variations of stickers, faces, and environmental conditions. Then, we propose a novel robust physical attack framework, dubbed PadvFace, to model these challenging variations specifically. Furthermore, we reveal that the attack complexities vary under different physical-world conditions and propose an efficient Curriculum Adversarial Attack (CAA) algorithm that gradually adapts adversarial stickers to environmental variations from easy to complex. Finally, we construct a standardized testing protocol to facilitate the fair evaluation of physical attacks on face recognition, and extensive experiments on both physical dodging and impersonation attacks demonstrate the superior performance of the proposed method. © 2022 Elsevier Ltd. All rights reserved.
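The curriculum idea in the abstract — ranking environmental variations from easy to hard and optimizing the adversarial sticker over a gradually expanding set of them — can be illustrated with a minimal toy sketch. Everything below is a hypothetical illustration of the easy-to-hard principle (toy linear surrogate loss, made-up function names), not the paper's actual PadvFace/CAA implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def attack_loss(delta, variation, w):
    """Toy surrogate loss: alignment of the perturbed input with a target
    direction w under an additive environmental variation (lower = more
    adversarial in this toy setup)."""
    return float(np.dot(w, delta + variation))

def curriculum_attack(variations, w, steps_per_stage=50, lr=0.1, eps=1.0):
    # Rank variations by how hard they are for the initial (zero) sticker:
    # low initial loss = already easy to attack, so it comes first.
    difficulty = [attack_loss(np.zeros_like(w), v, w) for v in variations]
    order = np.argsort(difficulty)

    delta = np.zeros_like(w)
    for stage in range(1, len(order) + 1):
        # Expand the curriculum: optimize over the first `stage` variations.
        active = [variations[i] for i in order[:stage]]
        for _ in range(steps_per_stage):
            # For this linear loss, d(loss)/d(delta) = w for every variation,
            # so the averaged gradient over the active set is just w.
            grad = np.mean([w for _ in active], axis=0)
            delta -= lr * grad
            # Keep the sticker perturbation bounded (printable-sticker budget).
            delta = np.clip(delta, -eps, eps)
    return delta

w = rng.normal(size=8)
variations = [rng.normal(scale=s, size=8) for s in (0.1, 0.5, 1.0)]
delta = curriculum_attack(variations, w)
```

The single perturbation `delta` that results has been trained against all variations, easiest first, mirroring the curriculum-learning schedule the abstract describes; the real method would replace the toy loss with a face-recognition model's similarity score and the additive variations with physical sticker/face/environment transformations.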
Pages: 12
Related Papers
50 records total
  • [31] Guest Editorial: Face Recognition and Spoofing Attacks
    Correia, Paulo Lobato
    Hadid, Abdenour
    IET BIOMETRICS, 2018, 7 (01) : 1 - 2
  • [32] PTB: Robust physical backdoor attacks against deep neural networks in real world
    Xue, Mingfu
    He, Can
    Wu, Yinghao
    Sun, Shichang
    Zhang, Yushu
    Wang, Jian
    Liu, Weiqiang
    COMPUTERS & SECURITY, 2022, 118
  • [33] Adversarial color projection: A projector-based physical-world attack to DNNs
    Hu, Chengyin
    Shi, Weiwen
    Tian, Ling
    IMAGE AND VISION COMPUTING, 2023, 140
  • [34] Face Synthesis for Eyeglass-Robust Face Recognition
    Guo, Jianzhu
    Zhu, Xiangyu
    Lei, Zhen
    Li, Stan Z.
    BIOMETRIC RECOGNITION, CCBR 2018, 2018, 10996 : 275 - 284
  • [35] Face3DAdv: Exploiting Robust Adversarial 3D Patches on Physical Face Recognition
    Yang, Xiao
    Xu, Longlong
    Pang, Tianyu
    Dong, Yinpeng
    Wang, Yikai
    Su, Hang
    Zhu, Jun
    INTERNATIONAL JOURNAL OF COMPUTER VISION, 2025, 133 (01) : 353 - 371
  • [36] Face Recognition System Robust to Occlusion
    Sharma, Mohit
    Prakash, Surya
    Gupta, Phalguni
    BIO-INSPIRED COMPUTING AND APPLICATIONS, 2012, 6840 : 604 - 609
  • [37] Robust face recognition with light compensation
    Huang, YS
    Tsai, YH
    Shieh, JW
    ADVANCES IN MUTLIMEDIA INFORMATION PROCESSING - PCM 2001, PROCEEDINGS, 2001, 2195 : 237 - 244
  • [38] Robust spectral regression for face recognition
    Guo, Yanqing
    He, Ran
    Zheng, Wei-Shi
    Kong, Xiangwei
    He, Zhaofeng
    NEUROCOMPUTING, 2013, 118 : 33 - 40
  • [39] Robust Sparse Coding for Face Recognition
    Yang, Meng
    Zhang, Lei
    Yang, Jian
    Zhang, David
    2011 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2011, : 625 - 632
  • [40] Background learning for robust face recognition
    Singh, RK
    Rajagopalan, AN
    16TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION, VOL III, PROCEEDINGS, 2002, : 525 - 528