BypTalker: An Adaptive Adversarial Example Attack to Bypass Prefilter-enabled Speaker Recognition

Cited: 0
Authors
Chen, Qianniu [1 ]
Fu, Kang [1 ]
Lu, Li [1 ]
Chen, Meng [1 ]
Ba, Zhongjie [1 ]
Lin, Feng [1 ]
Ren, Kui [1 ]
Affiliations
[1] Zhejiang Univ, Sch Cyber Sci & Technol, Coll Comp Sci & Technol, Hangzhou, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
DOI
10.1109/MSN60784.2023.00077
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
With the broad integration of deep learning in Speaker Recognition (SR) systems, adversarial example attacks have become a significant threat, raising user security concerns. Nevertheless, recent studies demonstrate that using input transformations (e.g., re-quantization, resampling, bandpass filtering) as a low-cost prefilter can efficiently mitigate such adversarial example attacks. These prefilters constrain the injection space of adversarial perturbations in both the time and frequency domains, leading to either degraded attack performance or amplified perturbation noise. This paper proposes a new adversarial example attack, BypTalker, which can bypass these prefilter-enabled SR systems while remaining imperceptible to human listeners. BypTalker employs ensemble learning with diverse substitute prefilters in the training phase to enhance the adversarial examples' adaptiveness to different prefilters. Furthermore, it incorporates an Acoustic Masker that effectively cloaks adversarial perturbations based on psychoacoustics. The masker is selected via a proposed metric, M-Sup, to minimize the perturbation's audibility to human perception. Experimental results show that BypTalker achieves an Attack Success Rate of 99.1% and a Perceptual Evaluation of Speech Quality (PESQ) score of 4.32.
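To make the ensemble-over-prefilters idea from the abstract concrete, the Python/PyTorch sketch below shows one plausible way to optimize a bounded perturbation so that it still fools the speaker model after each substitute prefilter (re-quantization, resampling, band-pass filtering) is applied. This is an assumption-laden illustration, not the authors' implementation: speaker_model, clean_wav, and target_id are hypothetical placeholders, and the psychoacoustic Acoustic Masker step is omitted.

# Minimal sketch (assumption, not the authors' released code) of ensemble-over-prefilters
# adversarial optimization: the perturbation is trained to survive every substitute prefilter.
import torch
import torch.nn.functional as F

def bandpass_fft(x, sr=16000, lo=300.0, hi=3400.0):
    # Differentiable band-pass prefilter implemented as an FFT-domain mask.
    X = torch.fft.rfft(x)
    freqs = torch.fft.rfftfreq(x.shape[-1], d=1.0 / sr, device=x.device)
    mask = ((freqs >= lo) & (freqs <= hi)).to(X.dtype)
    return torch.fft.irfft(X * mask, n=x.shape[-1])

def requantize(x, bits=8):
    # Re-quantization prefilter; a straight-through estimator keeps gradients flowing.
    levels = 2.0 ** (bits - 1)
    q = torch.round(x * levels) / levels
    return x + (q - x).detach()

def resample(x, factor=2):
    # Resampling prefilter approximated by linear down-sampling then up-sampling.
    y = F.interpolate(x[None, None, :], scale_factor=1.0 / factor,
                      mode="linear", align_corners=False)
    y = F.interpolate(y, size=x.shape[-1], mode="linear", align_corners=False)
    return y[0, 0]

SUBSTITUTE_PREFILTERS = [lambda x: x, bandpass_fft, requantize, resample]

def craft_adversarial(speaker_model, clean_wav, target_id,
                      steps=200, eps=0.005, lr=1e-3):
    # PGD-style optimization of a small perturbation that works under every prefilter.
    delta = torch.zeros_like(clean_wav, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    target = torch.tensor([target_id])
    for _ in range(steps):
        loss = 0.0
        for prefilter in SUBSTITUTE_PREFILTERS:          # ensemble over substitute prefilters
            adv = torch.clamp(clean_wav + delta, -1.0, 1.0)
            logits = speaker_model(prefilter(adv))       # assumed: 1-D logits over enrolled speakers
            loss = loss + F.cross_entropy(logits[None], target)
        opt.zero_grad()
        loss.backward()
        opt.step()
        with torch.no_grad():                            # keep the perturbation imperceptibly small
            delta.clamp_(-eps, eps)
    return torch.clamp(clean_wav + delta, -1.0, 1.0).detach()

Summing the loss over all substitute prefilters, rather than optimizing against a single one, is what would push the perturbation to remain effective regardless of which input transformation the defended system actually applies.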
Pages: 496 - 503
Number of Pages: 8