Interpretable, highly accurate brain decoding of subtly distinct brain states from functional MRI using intrinsic functional networks and long short-term memory recurrent neural networks

Cited by: 30
Authors
Li, Hongming [1 ]
Fan, Yong [1 ]
Affiliations
[1] Univ Penn, Perelman Sch Med, Dept Radiol, Ctr Biomed Image Comp & Analyt, Philadelphia, PA 19104 USA
Funding
National Institutes of Health (USA);
Keywords
Brain decoding; Working memory; Intrinsic functional networks; Recurrent neural networks; Long short-term memory; FMRI; CLASSIFICATION; ACTIVATION;
DOI
10.1016/j.neuroimage.2019.116059
Chinese Library Classification (CLC)
Q189 [Neuroscience];
Discipline classification code
071006;
Abstract
Decoding brain functional states underlying cognitive processes from functional MRI (fMRI) data using multivariate pattern analysis (MVPA) techniques has achieved promising performance for characterizing brain activation patterns and providing neurofeedback signals. However, it remains challenging to decode subtly distinct brain states at the level of individual fMRI data points, because different cognitive processes vary in temporal duration and depend on one another. In this study, we develop a deep learning-based framework for brain decoding that leverages recent advances in intrinsic functional network modeling and in sequence modeling with long short-term memory (LSTM) recurrent neural networks (RNNs). In particular, subject-specific intrinsic functional networks (FNs) are computed from resting-state fMRI data and used to characterize the functional signals of task fMRI data with a compact representation for building brain decoding models, and LSTM RNNs are adopted to learn decoding mappings between functional profiles and brain states. Validation on fMRI data from the HCP dataset demonstrated that brain decoding models built on training data with the proposed method learn discriminative latent feature representations and distinguish subtly distinct working memory tasks across subjects with significantly higher accuracy than conventional decoding models. Informative FNs identified by the brain decoding models as activation patterns of working memory tasks were largely consistent with the literature. The method also achieved promising decoding performance on motor and social cognition tasks. Our results suggest that LSTM RNNs in conjunction with FNs can build interpretable, highly accurate brain decoding models.
Pages: 11
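
The abstract describes a two-stage pipeline: task fMRI signals are first summarized as time courses of subject-specific intrinsic functional networks (FNs), and an LSTM then assigns a brain-state label to each time point. The sketch below is a minimal, hypothetical illustration of that idea in PyTorch; the least-squares projection onto FN spatial maps, the network sizes (90 FNs, 128 hidden units, 4 states), and all variable names are assumptions made for illustration, not the authors' implementation.

```python
# Hypothetical sketch of the pipeline outlined in the abstract:
# (1) project each task-fMRI frame onto subject-specific FN spatial maps to get a
#     compact per-frame functional profile, and
# (2) classify each time point's brain state with an LSTM over those profiles.
import torch
import torch.nn as nn

def fn_time_courses(task_bold, fn_maps):
    """Least-squares projection of task fMRI onto FN spatial maps (an assumed choice).

    task_bold: (T, V) tensor, T time points x V voxels/vertices
    fn_maps:   (K, V) tensor, K subject-specific FN spatial maps
    returns:   (T, K) tensor of FN time courses (functional profiles)
    """
    # Regress each fMRI frame on the FN maps: solve fn_maps.T @ X ~= task_bold.T for X.
    solution = torch.linalg.lstsq(fn_maps.T, task_bold.T).solution  # (K, T)
    return solution.T

class BrainStateLSTM(nn.Module):
    """LSTM mapping a sequence of FN profiles to a brain-state label per time point."""
    def __init__(self, n_fns=90, hidden_size=128, n_states=4):
        super().__init__()
        self.lstm = nn.LSTM(input_size=n_fns, hidden_size=hidden_size, batch_first=True)
        self.classifier = nn.Linear(hidden_size, n_states)

    def forward(self, profiles):              # profiles: (batch, T, n_fns)
        hidden, _ = self.lstm(profiles)       # (batch, T, hidden_size)
        return self.classifier(hidden)        # (batch, T, n_states) per-frame logits

# Toy usage with random data standing in for one subject's task run.
T, V, K = 200, 5000, 90                       # time points, voxels, FNs (illustrative)
task_bold = torch.randn(T, V)
fn_maps = torch.randn(K, V)
profiles = fn_time_courses(task_bold, fn_maps).unsqueeze(0)   # (1, T, K)
logits = BrainStateLSTM(n_fns=K)(profiles)                    # (1, T, 4)
labels = torch.randint(0, 4, (1, T))                          # per-frame state labels
loss = nn.CrossEntropyLoss()(logits.reshape(-1, 4), labels.reshape(-1))
```

Classifying every frame rather than a whole run is what lets such a model cope with cognitive processes of varying duration, which is the per-time-point decoding setting emphasized in the abstract.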