Modeling task-based fMRI data via deep belief network with neural architecture search

Cited by: 26
Authors
Qiang, Ning [1 ]
Dong, Qinglin [2 ,3 ]
Zhang, Wei [4 ]
Ge, Bao [1 ]
Ge, Fangfei [2 ,3 ]
Liang, Hongtao [1 ]
Sun, Yifei [1 ]
Gao, Jie [1 ]
Liu, Tianming [2 ,3 ]
Affiliations
[1] Shaanxi Normal Univ, Sch Phys & Informat Technol, Xian, Peoples R China
[2] Univ Georgia, Dept Comp Sci, Cort Architecture Imaging & Discovery Lab, Athens, GA 30602 USA
[3] Univ Georgia, Bioimaging Res Ctr, Athens, GA 30602 USA
[4] Univ Calif San Francisco, Dept Radiol & Biomed Imaging, San Francisco, CA 94143 USA
Funding
US National Science Foundation; US National Institutes of Health; National Natural Science Foundation of China;
Keywords
Task fMRI; Neural architecture search; Deep belief network; Deep learning; Unsupervised learning; SPARSE REPRESENTATION; INFERENCES; SIGNALS;
DOI
10.1016/j.compmedimag.2020.101747
Chinese Library Classification (CLC)
R318 [Biomedical Engineering];
Discipline code
0831;
Abstract
Deep neural networks have been shown to be powerful and flexible models for fMRI data, with representation ability superior to traditional methods. However, neural network architecture design remains a challenge: due to the high dimensionality of fMRI volume images, manually designing a network model is very time-consuming and rarely optimal. To tackle this problem, we propose an unsupervised neural architecture search (NAS) framework for a deep belief network (DBN) that models volumetric fMRI data, named NAS-DBN. The NAS-DBN framework is based on Particle Swarm Optimization (PSO), in which a swarm of candidate neural architectures evolves and converges to a feasible optimal solution. Experiments showed that the proposed NAS-DBN framework can quickly find a robust DBN architecture, yielding a hierarchical organization of functional brain networks (FBNs) and their temporal responses. Compared with 3 manually designed DBNs, the proposed NAS-DBN has the lowest testing loss of 0.0197, suggesting an overall performance improvement of up to 47.9%. For each task, the NAS-DBN identified 260 FBNs, including task-specific FBNs and resting-state networks (RSNs), which overlap strongly with general linear model (GLM)-derived templates and ICA-derived RSN templates. The average overlap rate between NAS-DBN and GLM on 20 task-specific FBNs is as high as 0.536, indicating a performance improvement of up to 63.9% with respect to network modeling. Moreover, the NAS-DBN simultaneously generates temporal responses that closely resemble the task designs, and widespread overlaps between FBNs from different layers of the NAS-DBN model form a hierarchical organization of FBNs. Our NAS-DBN framework contributes an effective, unsupervised NAS method for modeling volumetric task fMRI data. (C) 2020 Elsevier Ltd. All rights reserved.
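The abstract's core idea, a PSO search over DBN architectures, can be sketched in a few lines. This is an illustrative toy only, not the authors' implementation: the real fitness function is the DBN's unsupervised reconstruction loss on fMRI volumes, which is replaced here by a hypothetical quadratic stand-in (`toy_loss`), and the particle count, iteration budget, layer-width bounds, and PSO coefficients are all assumed values.

```python
import random

def toy_loss(arch):
    # Hypothetical stand-in for the DBN testing loss: pretend the "best"
    # architecture has hidden-layer widths (400, 200, 100).
    target = (400, 200, 100)
    return sum((a - t) ** 2 for a, t in zip(arch, target))

def pso_search(n_particles=20, n_iters=60, dims=3, lo=50, hi=500, seed=0):
    """Canonical PSO over continuous layer widths, rounded for evaluation."""
    rng = random.Random(seed)
    pos = [[rng.uniform(lo, hi) for _ in range(dims)] for _ in range(n_particles)]
    vel = [[0.0] * dims for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                      # personal best positions
    pbest_loss = [toy_loss([round(x) for x in p]) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_loss[i])
    gbest, gbest_loss = pbest[g][:], pbest_loss[g]   # global best
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia and acceleration coefficients (assumed)
    for _ in range(n_iters):
        for i in range(n_particles):
            for d in range(dims):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Keep layer widths inside the search bounds.
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            loss = toy_loss([round(x) for x in pos[i]])
            if loss < pbest_loss[i]:
                pbest[i], pbest_loss[i] = pos[i][:], loss
                if loss < gbest_loss:
                    gbest, gbest_loss = pos[i][:], loss
    return [round(x) for x in gbest], gbest_loss

arch, loss = pso_search()
print("best DBN layer widths:", arch, "loss:", loss)
```

In the paper's setting, evaluating a particle would mean training a DBN with those layer widths on fMRI volumes and measuring its unsupervised testing loss, so each fitness call is far more expensive than this toy.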
Pages: 12
Related papers (50 in total)
  • [31] A Rodent Analogue of the Fearful Face Task via Ultrasonic Vocalization Playback: Evidence From Behavioral, Neural, and Task-Based fMRI Analyses
    Honeycutt, Jennifer
    Granata, Lauren
    Demaestri, Camila
    Kulkarni, Praveen
    Ferris, Craig
    Brenhouse, Heather
    NEUROPSYCHOPHARMACOLOGY, 2020, 45 (SUPPL 1) : 283 - 284
  • [32] Optimizing Deep Feedforward Neural Network Architecture: A Tabu Search Based Approach
    Gupta, Tarun Kumar
    Raza, Khalid
    NEURAL PROCESSING LETTERS, 2020, 51 (03) : 2855 - 2870
  • [33] Task-specific feature extraction and classification of fMRI volumes using a deep neural network initialized with a deep belief network: Evaluation using sensorimotor tasks
    Jang, Hojin
    Plis, Sergey M.
    Calhoun, Vince D.
    Lee, Jong-Hwan
    NEUROIMAGE, 2017, 145 : 314 - 328
  • [34] Two for the Price of One? Analyzing Task-based fMRI Data with Resting-State fMRI Methods
    Field, Aaron
    Birn, Rasmus
    RADIOLOGY, 2021, 301 (01) : 185 - 186
  • [35] Student Performance Prediction Using Atom Search Optimization Based Deep Belief Neural Network
    Surenthiran, S.
    Rajalakshmi, R.
    Sujatha, S. S.
    OPTICAL MEMORY AND NEURAL NETWORKS, 2021, 30 (02) : 157 - 171
  • [37] Accelerating Neural Architecture Search via Proxy Data
    Na, Byunggook
    Mok, Jisoo
    Choe, Hyeokjun
    Yoon, Sungroh
    PROCEEDINGS OF THE THIRTIETH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2021, 2021, : 2848 - 2854
  • [38] Efficient spiking neural network design via neural architecture search
    Yan, Jiaqi
    Liu, Qianhui
    Zhang, Malu
    Feng, Lang
    Ma, De
    Li, Haizhou
    Pan, Gang
    NEURAL NETWORKS, 2024, 173
  • [39] Water atom search algorithm-based deep recurrent neural network for the big data classification based on spark architecture
    Dabbu, Murali
    Karuppusamy, Loheswaran
    Pulugu, Dileep
    Vootla, Subba Ramaiah
    Reddyvari, Venkateswar Reddy
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2022, 13 (08) : 2297 - 2312
  • [40] OPTIMIZE CNN MODEL FOR FMRI SIGNAL CLASSIFICATION VIA ADANET-BASED NEURAL ARCHITECTURE SEARCH
    Dai, Haixing
    Ge, Fangfei
    Li, Qing
    Zhang, Wei
    Liu, Tianming
    2020 IEEE 17TH INTERNATIONAL SYMPOSIUM ON BIOMEDICAL IMAGING (ISBI 2020), 2020, : 1399 - 1403