A multi-task minutiae transformer network for fingerprint recognition of young children

Cited by: 0
Authors
Liu, Manhua [1 ]
Liu, Aitong [1 ]
Shi, Yelin [1 ]
Liu, Shuxin [2 ]
Affiliations
[1] Shanghai Jiao Tong Univ, Artificial Intelligence Inst, Sch Elect Informat & Elect Engn, MoE Key Lab Artificial Intelligence, Shanghai 200240, Peoples R China
[2] Shanghai Dianji Univ, Sch Elect Engn, Shanghai, Peoples R China
Keywords
Fingerprint recognition of young children; Multi-task learning; Fingerprint enhancement; Minutiae extraction; Transformer; EXTRACTION; POSE;
DOI
10.1016/j.eswa.2025.126825
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Fingerprint recognition of young children has attracted increasing attention for real-world applications such as identity certification. However, recognition performance degrades greatly when existing systems are applied directly to the fingerprints of young children, owing to their low resolution and poor image quality. Towards more accurate fingerprint recognition of young children, this paper proposes a multi-task deep learning framework based on a Pyramid Densely-connected U-shaped Swin-transformer network (PDUSwin-Net) that jointly learns the reconstruction of enhanced high-resolution images and the detection of minutiae points, and remains compatible with existing adult fingerprint sensors (500 dpi) and minutiae matchers. First, a pyramid densely-connected U-shaped convolutional network is proposed to learn fingerprint features shared across the tasks. Then, a Swin-transformer attention block is added to model long-range spatial correlations among the features. In the decoding part, two branches are built for the fingerprint enhancement and minutiae extraction tasks. Finally, our method is tested with existing matchers on two independent fingerprint datasets of young children aged 0-2 years. Results and comparisons show that our method outperforms other methods for fingerprint recognition of young children.
Pages: 15
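The abstract describes the architecture only at a high level, so the minimal PyTorch sketch below is an illustrative assumption of how such a multi-task design could be wired up, not the authors' implementation: a densely-connected convolutional encoder, a plain multi-head self-attention bottleneck standing in for the windowed Swin-transformer block, and two decoder heads for image enhancement and minutiae-map prediction. All class names (DenseConvBlock, AttentionBottleneck, MultiTaskFingerprintNet) and hyper-parameters are hypothetical, and the pyramid/U-shaped structure is reduced to a single downsampling stage for brevity.

```python
# Illustrative sketch of a multi-task fingerprint network in the spirit of the abstract.
# Module names, channel sizes, and losses are assumptions, not the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DenseConvBlock(nn.Module):
    """Two 3x3 conv layers with dense (concatenative) connectivity."""

    def __init__(self, in_ch, growth=32):
        super().__init__()
        self.conv1 = nn.Sequential(nn.Conv2d(in_ch, growth, 3, padding=1), nn.ReLU(inplace=True))
        self.conv2 = nn.Sequential(nn.Conv2d(in_ch + growth, growth, 3, padding=1), nn.ReLU(inplace=True))
        self.out_ch = in_ch + 2 * growth

    def forward(self, x):
        f1 = self.conv1(x)
        f2 = self.conv2(torch.cat([x, f1], dim=1))
        return torch.cat([x, f1, f2], dim=1)


class AttentionBottleneck(nn.Module):
    """Global multi-head self-attention over spatial tokens
    (a simplified stand-in for the windowed Swin-transformer block)."""

    def __init__(self, ch, heads=4):
        super().__init__()
        self.norm = nn.LayerNorm(ch)
        self.attn = nn.MultiheadAttention(ch, heads, batch_first=True)

    def forward(self, x):
        b, c, h, w = x.shape
        tokens = x.flatten(2).transpose(1, 2)            # (B, H*W, C)
        q = self.norm(tokens)
        tokens = tokens + self.attn(q, q, q, need_weights=False)[0]
        return tokens.transpose(1, 2).reshape(b, c, h, w)


class MultiTaskFingerprintNet(nn.Module):
    """Shared encoder + attention bottleneck + two task heads:
    enhanced-image reconstruction and minutiae probability map."""

    def __init__(self, base=32):
        super().__init__()
        self.stem = nn.Conv2d(1, base, 3, padding=1)
        self.enc1 = DenseConvBlock(base)                               # full resolution
        self.down = nn.Conv2d(self.enc1.out_ch, self.enc1.out_ch, 3, stride=2, padding=1)
        self.enc2 = DenseConvBlock(self.enc1.out_ch)                   # half resolution
        self.bottleneck = AttentionBottleneck(self.enc2.out_ch)
        self.up = nn.ConvTranspose2d(self.enc2.out_ch, self.enc1.out_ch, 2, stride=2)
        # Two decoder heads share all of the features above (multi-task learning).
        self.enhance_head = nn.Conv2d(self.enc1.out_ch, 1, 1)
        self.minutiae_head = nn.Conv2d(self.enc1.out_ch, 1, 1)

    def forward(self, x):                    # x: (B, 1, H, W) with H, W even
        s1 = self.enc1(self.stem(x))
        s2 = self.enc2(self.down(s1))
        z = self.bottleneck(s2)
        d = self.up(z) + s1                  # U-shaped skip connection
        return torch.sigmoid(self.enhance_head(d)), torch.sigmoid(self.minutiae_head(d))


if __name__ == "__main__":
    # Toy forward pass and a joint loss; targets here are placeholders, not real data.
    net = MultiTaskFingerprintNet()
    x = torch.randn(2, 1, 128, 128)
    enhanced, minutiae = net(x)
    target_img = torch.rand_like(enhanced)       # would be a high-quality reference image
    target_map = torch.zeros_like(minutiae)      # would be a ground-truth minutiae map
    loss = F.mse_loss(enhanced, target_img) + F.binary_cross_entropy(minutiae, target_map)
    print(enhanced.shape, minutiae.shape, float(loss))
```

The toy main block illustrates the joint training signal implied by the abstract: a reconstruction loss on the enhanced image plus a pixel-wise loss on the minutiae map, summed into a single objective so that both branches drive the shared encoder.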