Enhancing Multimodal Patterns in Neuroimaging by Siamese Neural Networks with Self-Attention Mechanism

Cited by: 14
Authors
Arco, Juan E. [1 ,2 ,3 ]
Ortiz, Andres [2 ,3 ]
Gallego-Molina, Nicolas J. [2 ,3 ]
Gorriz, Juan M. [1 ,3 ]
Ramirez, Javier [1 ,3 ]
Affiliations
[1] Univ Granada, Dept Signal Theory Networking & Commun, Granada 18010, Spain
[2] Univ Malaga, Dept Commun Engn, Malaga 29010, Spain
[3] Andalusian Res Inst Data Sci & Computat Intellige, Granada, Spain
Keywords
Multimodal combination; siamese neural network; self-attention; deep learning; medical imaging; ALZHEIMERS-DISEASE; FUNCTIONAL CONNECTIVITY; MATTER LOSS; DIAGNOSIS; FUSION; MULTISCALE; MRI;
DOI
10.1142/S0129065723500193
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Combining different sources of information is currently one of the most relevant aspects of the diagnostic process for several diseases. In the field of neurological disorders, different imaging modalities providing structural and functional information are frequently available. These modalities are usually analyzed separately, although jointly analyzing the features extracted from both sources can improve the classification performance of Computer-Aided Diagnosis (CAD) tools. Previous studies have computed independent models from each modality and combined them in a subsequent stage, which is not an optimal solution. In this work, we propose a method based on the principles of siamese neural networks to fuse information from Magnetic Resonance Imaging (MRI) and Positron Emission Tomography (PET). This framework quantifies the similarities between both modalities and relates them to the diagnostic label during training. The resulting latent space at the output of this network is then fed into an attention module in order to evaluate the relevance of each brain region at different stages of the development of Alzheimer's disease. The excellent results obtained and the high flexibility of the proposed method allow the fusion of more than two modalities, leading to a scalable methodology that can be used in a wide range of contexts.
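The abstract's pipeline (a shared-weight, i.e. siamese, encoder applied to each modality, a similarity measure between the resulting latent codes, and a self-attention module that weights brain regions before classification) can be illustrated with a minimal PyTorch sketch. This is not the authors' implementation; the region count, layer sizes, fusion rule, and class count below are all hypothetical assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SiameseAttentionFusion(nn.Module):
    """Illustrative sketch: siamese encoding of MRI and PET region
    features, cosine similarity between the two latent codes, and
    self-attention over brain regions before classification."""

    def __init__(self, n_regions=116, feat_dim=64, n_heads=4, n_classes=2):
        super().__init__()
        # One encoder, applied to BOTH modalities: the shared weights
        # are what makes the architecture "siamese".
        self.encoder = nn.Sequential(
            nn.Linear(1, feat_dim), nn.ReLU(), nn.Linear(feat_dim, feat_dim)
        )
        self.attn = nn.MultiheadAttention(feat_dim, n_heads, batch_first=True)
        self.classifier = nn.Linear(feat_dim, n_classes)

    def forward(self, mri, pet):
        # mri, pet: (batch, n_regions) region-wise scalar measurements
        z_mri = self.encoder(mri.unsqueeze(-1))  # (B, R, feat_dim)
        z_pet = self.encoder(pet.unsqueeze(-1))  # (B, R, feat_dim)
        # Per-region similarity between modalities in the latent space
        sim = F.cosine_similarity(z_mri, z_pet, dim=-1)  # (B, R)
        fused = z_mri + z_pet  # simple additive fusion of latent codes
        # Self-attention yields per-region relevance weights
        out, weights = self.attn(fused, fused, fused)
        logits = self.classifier(out.mean(dim=1))  # pool regions, classify
        return logits, sim, weights

model = SiameseAttentionFusion()
mri = torch.randn(8, 116)
pet = torch.randn(8, 116)
logits, sim, w = model(mri, pet)
```

Here `logits` has shape (8, 2), `sim` (8, 116), and the averaged attention weights `w` (8, 116, 116); inspecting `w` is one plausible way to read off region relevance, as the abstract describes.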
Pages: 18
Related Papers
50 records in total
  • [31] An Improved Siamese Tracking Network Based On Self-Attention And Cross-Attention
    Lai Yijun
    Song Jianmei
    She Haoping
    2023 35TH CHINESE CONTROL AND DECISION CONFERENCE, CCDC, 2023, : 466 - 470
  • [32] Multiple Protein Subcellular Locations Prediction Based on Deep Convolutional Neural Networks with Self-Attention Mechanism
    Cong, Hanhan
    Liu, Hong
    Cao, Yi
    Chen, Yuehui
    Liang, Cheng
    INTERDISCIPLINARY SCIENCES-COMPUTATIONAL LIFE SCIENCES, 2022, 14 (02) : 421 - 438
  • [34] Probabilistic Matrix Factorization Recommendation of Self-Attention Mechanism Convolutional Neural Networks With Item Auxiliary Information
    Zhang, Chenkun
    Wang, Cheng
    IEEE ACCESS, 2020, 8 (08): : 208311 - 208321
  • [35] Piecewise convolutional neural network relation extraction with self-attention mechanism
    Zhang, Bo
    Xu, Li
    Liu, Ke-Hao
    Yang, Ru
    Li, Mao-Zhen
    Guo, Xiao-Yang
    PATTERN RECOGNITION, 2025, 159
  • [36] Global Convolutional Neural Networks With Self-Attention for Fisheye Image Rectification
    Kim, Byunghyun
    Lee, Dohyun
    Min, Kyeongyuk
    Chong, Jongwha
    Joe, Inwhee
    IEEE ACCESS, 2022, 10 : 129580 - 129587
  • [37] Lipschitz Normalization for Self-Attention Layers with Application to Graph Neural Networks
    Dasoulas, George
    Scaman, Kevin
    Virmaux, Aladin
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139
  • [38] Adaptive Feature Self-Attention in Spiking Neural Networks for Hyperspectral Classification
    Li, Heng
    Tu, Bing
    Liu, Bo
    Li, Jun
    Plaza, Antonio
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2025, 63
  • [39] Combining convolutional neural networks and self-attention for fundus diseases identification
    Wang, Keya
    Xu, Chuanyun
    Li, Gang
    Zhang, Yang
    Zheng, Yu
    Sun, Chengjie
    SCIENTIFIC REPORTS, 2023, 13 (01)
  • [40] ARCHITECTURE SELF-ATTENTION MECHANISM: NONLINEAR OPTIMIZATION FOR NEURAL ARCHITECTURE SEARCH
    Hao, Jie
    Zhu, William
    JOURNAL OF NONLINEAR AND VARIATIONAL ANALYSIS, 2021, 5 (01): : 119 - 140