Time-distanced gates in long short-term memory networks

Cited by: 20
Authors
Gao, Riqiang [1]
Tang, Yucheng [1]
Xu, Kaiwen [1]
Huo, Yuankai [1]
Bao, Shunxing [1]
Antic, Sanja L. [2]
Epstein, Emily S. [2]
Deppen, Steve [3]
Paulson, Alexis B. [4]
Sandler, Kim L. [4]
Massion, Pierre P. [2]
Landman, Bennett A. [1,4]
Affiliations
[1] Vanderbilt Univ, Elect Engn & Comp Sci, Nashville, TN 37235 USA
[2] Vanderbilt Univ, Sch Med, Med, Nashville, TN 37235 USA
[3] Vanderbilt Univ, Med Ctr, Thorac Surg, Nashville, TN 37235 USA
[4] Vanderbilt Univ, Med Ctr, Radiol, Nashville, TN 37235 USA
Keywords
Lung cancer diagnosis; Longitudinal; Distanced LSTM; Temporal Emphasis Model
DOI
10.1016/j.media.2020.101785
Chinese Library Classification
TP18 [Theory of artificial intelligence]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
The Long Short-Term Memory (LSTM) network is widely used to model sequential observations in fields ranging from natural language processing to medical imaging. The LSTM has shown promise for interpreting computed tomography (CT) in lung screening protocols. Yet, traditional image-based LSTM models ignore interval differences, while recently proposed interval-modeled LSTM variants are limited in their ability to interpret temporal proximity. Meanwhile, clinical imaging acquisition may be irregularly sampled, and such sampling patterns may be commingled with clinical usage. In this paper, we propose the Distanced LSTM (DLSTM), which introduces time-distanced gates (i.e., gates conditioned on the time distance to the last scan) with a temporal emphasis model (TEM), targeting lung cancer diagnosis (i.e., evaluating the malignancy of pulmonary nodules). Briefly, (1) the time distance of every scan to the last scan is modeled explicitly, (2) time-distanced input and forget gates in the DLSTM are introduced across regularly and irregularly sampled sequences, and (3) newer scans in serial data are emphasized by the TEM. The DLSTM algorithm is evaluated with both simulated data and real CT images (from 1794 National Lung Screening Trial (NLST) patients with longitudinal scans and 1420 clinically studied patients). Experimental results on simulated data indicate that the DLSTM can capture families of temporal relationships that cannot be detected with a traditional LSTM. Cross-validation on empirical CT datasets demonstrates that the DLSTM achieves leading performance on both regularly and irregularly sampled data (e.g., improving the LSTM F1 score on NLST from 0.6785 to 0.7085). In external validation on irregularly acquired data, the benchmarks achieved AUC scores of 0.8350 (CNN features) and 0.8380 (with LSTM), while the proposed DLSTM achieved 0.8905. In conclusion, the DLSTM approach is shown to be compatible with families of linear, quadratic, exponential, and log-exponential temporal models. The DLSTM can be readily extended with other temporal dependence interactions while hardly increasing overall model complexity. (c) 2020 Elsevier B.V. All rights reserved.
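For intuition, the following is a minimal, hypothetical sketch of the gating idea summarized in the abstract: an LSTM cell whose input and forget gates are scaled by an exponential temporal emphasis term TEM(d) = exp(-lambda * d), where d is the time distance of the current scan to the most recent scan. The class name DistancedLSTMCell, the learnable log_lam parameter, and the specific exponential form are illustrative assumptions (the paper also considers linear, quadratic, and log-exponential temporal models); this is a sketch, not the authors' implementation.

```python
# Illustrative sketch only -- not the published DLSTM code.
# Assumption: input and forget gates are modulated by TEM(d) = exp(-lambda * d),
# where d is the time distance (e.g., in years) to the most recent scan.
import torch
import torch.nn as nn


class DistancedLSTMCell(nn.Module):
    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        self.hidden_size = hidden_size
        # one linear map produces the four LSTM pre-activations (i, f, g, o)
        self.gates = nn.Linear(input_size + hidden_size, 4 * hidden_size)
        self.log_lam = nn.Parameter(torch.zeros(1))  # learnable decay rate (assumed)

    def forward(self, x, d, state):
        # x: (batch, input_size) image features of one scan
        # d: (batch, 1) time distance to the last scan in the sequence
        # state: (h_prev, c_prev), each (batch, hidden_size)
        h_prev, c_prev = state
        i, f, g, o = self.gates(torch.cat([x, h_prev], dim=1)).chunk(4, dim=1)
        tem = torch.exp(-torch.exp(self.log_lam) * d)  # in (0, 1]; ~1 for the newest scan
        i = torch.sigmoid(i) * tem                     # time-distanced input gate
        f = torch.sigmoid(f) * tem                     # time-distanced forget gate
        c = f * c_prev + i * torch.tanh(g)
        h = torch.sigmoid(o) * torch.tanh(c)
        return h, c


# Unrolling over a patient's scan sequence; when all distances are zero,
# tem == 1 and the cell reduces to a standard LSTM cell.
cell = DistancedLSTMCell(input_size=128, hidden_size=64)
feats = torch.randn(2, 3, 128)                  # 2 patients, 3 scans each (dummy features)
dists = torch.tensor([[2.0, 1.0, 0.0],          # years from each scan to the last scan
                      [3.5, 0.5, 0.0]]).unsqueeze(-1)
h = torch.zeros(2, 64)
c = torch.zeros(2, 64)
for t in range(feats.shape[1]):
    h, c = cell(feats[:, t], dists[:, t], (h, c))
```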
Pages: 10