SGAMF: Sparse Gated Attention-Based Multimodal Fusion Method for Fake News Detection

Cited by: 1
Authors
Du, Pengfei [1 ]
Gao, Yali [1 ]
Li, Linghui [1 ]
Li, Xiaoyong [1 ]
Affiliations
[1] Beijing Univ Posts & Telecommun, Key Lab Trustworthy Distributed Comp & Serv, Minist Educ, Beijing 100876, Peoples R China
Keywords
Sparse gated attention; multimodal fusion; fake news detection
DOI
10.1109/TBDATA.2024.3414341
Chinese Library Classification: TP [Automation technology, computer technology]
Discipline code: 0812
Abstract
In the field of fake news detection, deep learning techniques have emerged as superior performers in recent years. Nevertheless, most of these studies concentrate on either unimodal feature-based methods or image-text multimodal fusion, with minimal attention to fusing unstructured text features with structured tabular features. In this study, we present SGAMF, a Sparse Gated Attention-based Multimodal Fusion strategy, designed to combine text features and auxiliary features for fake news identification. Compared with traditional multimodal fusion methods, SGAMF effectively balances accuracy and inference time while selecting the most important features. A novel sparse-gated-attention mechanism is proposed that shifts the text representation conditioned on auxiliary features, thereby selectively filtering out non-essential features. We further put forward an enhanced ALBERT for encoding text features, capable of balancing efficiency and accuracy. To validate our methodology, we have developed a multimodal COVID-19 fake news detection dataset. Comprehensive experimental results on this dataset demonstrate that SGAMF delivers competitive performance against existing state-of-the-art techniques in terms of accuracy and F1 score.
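The abstract's core idea, a gate over text features that is conditioned on auxiliary (tabular) features and sparsified so unimportant dimensions are dropped, can be illustrated with a minimal sketch. This is not the paper's implementation: the weight matrices, the sigmoid gate, and the hard threshold `tau` are all illustrative assumptions.

```python
# Minimal sketch of a sparse gated fusion step (illustrative, not the
# paper's actual SGAMF architecture): a sigmoid gate computed from the
# concatenated text and auxiliary features is hard-thresholded at tau,
# and the text representation is shifted by an aux-conditioned term.
import numpy as np

def sparse_gated_fusion(text_feat, aux_feat, W_gate, W_shift, tau=0.5):
    """Gate and shift the text representation using auxiliary features,
    zeroing gate entries below tau so non-essential dimensions drop out."""
    combined = np.concatenate([text_feat, aux_feat])
    gate = 1.0 / (1.0 + np.exp(-(W_gate @ combined)))  # sigmoid gate
    gate = np.where(gate >= tau, gate, 0.0)            # sparsify by threshold
    shift = W_shift @ aux_feat                         # aux-conditioned shift
    return gate * (text_feat + shift)                  # gated, shifted text

# Toy dimensions: 8-d text embedding, 4-d auxiliary (tabular) features.
rng = np.random.default_rng(0)
d_text, d_aux = 8, 4
fused = sparse_gated_fusion(
    rng.normal(size=d_text), rng.normal(size=d_aux),
    rng.normal(size=(d_text, d_text + d_aux)), rng.normal(size=(d_text, d_aux)),
)
print(fused.shape)  # → (8,)
```

Raising `tau` trades accuracy for sparsity: more gate entries fall below the threshold and their text dimensions are filtered out entirely, which is the mechanism the abstract credits for balancing accuracy against inference time.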
Pages: 540-552 (13 pages)