CCBLA: a Lightweight Phishing Detection Model Based on CNN, BiLSTM, and Attention Mechanism

Cited by: 10
Authors
Zhu, Erzhou [1 ]
Yuan, Qixiang [1 ]
Chen, Zhile [1 ]
Li, Xuejian [1 ]
Fang, Xianyong [1 ]
Affiliations
[1] Anhui Univ, Key Lab Intelligent Comp & Signal Proc, Minist Educ, Sch Comp Sci & Technol, Hefei 230601, Peoples R China
Keywords
Phishing detection; Deep learning; Neural network; Attention mechanism; Feature selection
DOI
10.1007/s12559-022-10024-4
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Phishing, in which social engineering techniques such as emails and instant messaging are employed and malicious links are disguised as normal URLs to steal sensitive information, is currently a major threat to networks worldwide. Phishing detection systems generally adopt feature engineering as one of the most important approaches to detect or even prevent phishing attacks. However, the accuracy of feature-engineering systems depends heavily on prior knowledge of the features, and extracting comprehensive features from different dimensions to achieve high detection accuracy is time-consuming. To address these issues, this paper proposes a lightweight model that combines a convolutional neural network (CNN), bi-directional long short-term memory (BiLSTM), and an attention mechanism for phishing detection. The proposed model, called the char-convolutional and BiLSTM with attention mechanism (CCBLA) model, employs deep learning to automatically extract features from target URLs and uses the attention mechanism to weight the extracted features according to their importance in phishing detection. Experiments conducted on two datasets of different scales show that CCBLA detects phishing attacks accurately with minimal time consumption.
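As a concrete illustration of the input stage such a char-convolutional model consumes, the sketch below encodes a raw URL as a fixed-length sequence of character indices. The vocabulary, padding scheme, sequence length, and the names `CHARSET`, `encode_url`, and `MAX_LEN` are illustrative assumptions, not details taken from the paper.

```python
import string

# Hypothetical character vocabulary for URLs. Index 0 is reserved for
# padding; characters outside the vocabulary map to a dedicated UNK index.
CHARSET = string.ascii_lowercase + string.digits + "-._~:/?#[]@!$&'()*+,;=%"
CHAR_TO_IDX = {c: i + 1 for i, c in enumerate(CHARSET)}
UNK_IDX = len(CHARSET) + 1
MAX_LEN = 200  # assumed fixed input length expected by the CNN layer


def encode_url(url: str, max_len: int = MAX_LEN) -> list:
    """Map a URL to a fixed-length sequence of character indices.

    The URL is lowercased, truncated to max_len characters, each
    character replaced by its vocabulary index, and the result
    right-padded with zeros.
    """
    indices = [CHAR_TO_IDX.get(c, UNK_IDX) for c in url.lower()[:max_len]]
    return indices + [0] * (max_len - len(indices))


vec = encode_url("http://example.com/login")
```

An embedding layer would then map each index to a dense vector before the convolutional and BiLSTM stages; this representation lets the model learn URL features automatically rather than relying on hand-crafted ones.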
Pages: 1320-1333
Number of pages: 14
Related Papers
50 records total
  • [21] A novel ensemble model for fall detection: leveraging CNN and BiLSTM with channel and temporal attention
    Sahni, Sarita
    Jain, Sweta
    Saritha, Sri Khetwat
    AUTOMATIKA, 2025, 66 (02) : 103 - 116
  • [22] PM2.5 Concentration Prediction Based on CNN-BiLSTM and Attention Mechanism
    Zhang, Jinsong
    Peng, Yongtao
    Ren, Bo
    Li, Taoying
    ALGORITHMS, 2021, 14 (07)
  • [23] Music Audio Sentiment Classification Based on CNN-BiLSTM and Attention Model
    Chen Zhen
    Liu Changhui
    2021 4TH INTERNATIONAL CONFERENCE ON ROBOTICS, CONTROL AND AUTOMATION ENGINEERING (RCAE 2021), 2021, : 156 - 160
  • [24] A Prediction Method of Consumer Buying Behavior Based on Attention Mechanism and CNN-BiLSTM
    Wang, Jian-Nan
    Cui, Jian-Feng
    Chen, Chin-Ling
    Journal of Network Intelligence, 2022, 7 (02): : 375 - 385
  • [25] COSMIC-2 RFI Prediction Model Based on CNN-BiLSTM-Attention for Interference Detection and Location
    Song, Cheng-Long
    Jin, Rui-Min
    Han, Chao
    Wang, Dan-Dan
    Guo, Ya-Ping
    Cui, Xiang
    Wang, Xiao-Ni
    Bai, Pei-Rui
    Zhen, Wei-Min
    SENSORS, 2024, 24 (23)
  • [26] Plant disease detection based on lightweight CNN model
    Liu, Yang
    Gao, Guoqin
    Zhang, Zhenhui
    2021 4TH INTERNATIONAL CONFERENCE ON INFORMATION AND COMPUTER TECHNOLOGIES (ICICT 2021), 2021, : 64 - 68
  • [27] Tool Wear Detection Based on Improved CNN-BiLSTM Model
    Liu H.
    Zhang S.
    Li J.
    Luan X.
Zhongguo Jixie Gongcheng/China Mechanical Engineering, 2022, 33 (16): : 1940 - 1947, 1956
  • [28] Attention-based BiLSTM fused CNN with gating mechanism model for Chinese long text classification
    Deng, Jianfeng
    Cheng, Lianglun
    Wang, Zhuowei
    COMPUTER SPEECH AND LANGUAGE, 2021, 68
  • [29] Research on EEG emotion recognition based on CNN+BiLSTM+self-attention model
    Li, Xueqing
    Li, Penghai
    Fang, Zhendong
    Cheng, Longlong
    Wang, Zhiyong
    Wang, Weijie
    OPTOELECTRONICS LETTERS, 2023, 19 (08) : 506 - 512