Transfer learning of pre-trained CNNs on digital transaction fraud detection

Cited by: 0
Authors
Tekkali, Chandana Gouri [1 ]
Natarajan, Karthika [2 ]
Affiliations
[1] Raghu Engn Coll, Comp Sci Engn, Visakhapatnam, Andhra Pradesh, India
[2] VIT AP Univ, Sch Comp Sci & Engn, Amaravati, Andhra Pradesh, India
Keywords
Digital transaction; pre-trained models; convolutional neural networks; transfer learning;
DOI
10.3233/KES-230067
CLC number (Chinese Library Classification)
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
This article proposes an artificial-intelligence-empowered, efficient approach for detecting customers with Severe Failure in Digital Transactions (SFDT) via deep transfer learning on discretized fraud data. Real-time global payment systems currently suffer primarily from fraudsters who exploit customer behavior, and although researchers have applied many techniques to fraud identification, identifying and tracking the customers affected by fraud still takes a significant amount of time. The proposed study employs pre-trained convolutional neural network (CNN) architectures to detect SFDT, pre-training the CNNs on fraud data across several network architectures. The article fine-tunes recent versions of these networks, namely ResNet152, DenseNet201, InceptionNetV4, and EfficientNetB7, integrating a loss function to minimize error. Numerous experiments on a large, publicly available credit-payment transaction data set show that the model detects SFDT at a high rate, compares favorably in accuracy with other fraud detection methods, and achieves the lowest loss cost.
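The record contains no code, but a minimal sketch of the fine-tuning scheme the abstract describes could look like the following. PyTorch/torchvision is an assumption; so are the image-shaped encoding of the discretized transaction features, the Adam optimizer, and the cross-entropy loss, none of which the abstract specifies.

```python
# Minimal transfer-learning sketch for binary fraud (SFDT) detection.
# All hyperparameters and the 3x224x224 input encoding are illustrative
# assumptions, not the paper's published configuration.
import torch
import torch.nn as nn
from torchvision import models

# Load ResNet152 with ImageNet weights (one of the four backbones named in
# the abstract; DenseNet201 or EfficientNetB7 would be swapped in the same way).
backbone = models.resnet152(weights=models.ResNet152_Weights.IMAGENET1K_V2)

# Freeze the pre-trained feature extractor so only the new head is trained.
for p in backbone.parameters():
    p.requires_grad = False

# Replace the 1000-class ImageNet head with a 2-class fraud/legitimate head.
backbone.fc = nn.Linear(backbone.fc.in_features, 2)

# Cross-entropy as the loss being minimized (the abstract only says a loss
# function is integrated to minimize error).
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(backbone.fc.parameters(), lr=1e-3)

# One illustrative training step on a dummy batch of "transaction images",
# i.e., discretized tabular features tiled into an image-shaped tensor.
x = torch.randn(8, 3, 224, 224)   # batch of encoded transactions
y = torch.randint(0, 2, (8,))     # 0 = legitimate, 1 = fraud (SFDT)
optimizer.zero_grad()
loss = criterion(backbone(x), y)
loss.backward()
optimizer.step()
```

Freezing the backbone and training only the replacement head is the standard first stage of such transfer learning; unfreezing the deeper layers for a second, lower-learning-rate pass would be the usual follow-up.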
Pages: 571-580
Page count: 10