Time-distanced gates in long short-term memory networks

Times Cited: 20
Authors
Gao, Riqiang [1 ]
Tang, Yucheng [1 ]
Xu, Kaiwen [1 ]
Huo, Yuankai [1 ]
Bao, Shunxing [1 ]
Antic, Sanja L. [2 ]
Epstein, Emily S. [2 ]
Deppen, Steve [3 ]
Paulson, Alexis B. [4 ]
Sandler, Kim L. [4 ]
Massion, Pierre P. [2 ]
Landman, Bennett A. [1 ,4 ]
Affiliations
[1] Vanderbilt Univ, Elect Engn & Comp Sci, Nashville, TN 37235 USA
[2] Vanderbilt Univ, Sch Med, Med, Nashville, TN 37235 USA
[3] Vanderbilt Univ, Med Ctr, Thorac Surg, Nashville, TN 37235 USA
[4] Vanderbilt Univ, Med Ctr, Radiol, Nashville, TN 37235 USA
Keywords
Lung cancer diagnosis; Longitudinal; Distanced LSTM; Temporal Emphasis Model;
DOI
10.1016/j.media.2020.101785
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The Long Short-Term Memory (LSTM) network is widely used in modeling sequential observations in fields ranging from natural language processing to medical imaging. The LSTM has shown promise for interpreting computed tomography (CT) in lung screening protocols. Yet, traditional image-based LSTM models ignore interval differences, while recently proposed interval-modeled LSTM variants are limited in their ability to interpret temporal proximity. Meanwhile, clinical imaging acquisition may be irregularly sampled, and such sampling patterns may be commingled with clinical usages. In this paper, we propose the Distanced LSTM (DLSTM) by introducing time-distanced (i.e., time distance to the last scan) gates with a temporal emphasis model (TEM) targeting lung cancer diagnosis (i.e., evaluating the malignancy of pulmonary nodules). Briefly, (1) the time distance of every scan to the last scan is modeled explicitly, (2) time-distanced input and forget gates in DLSTM are introduced across regular and irregular sampling sequences, and (3) the newer scan in serial data is emphasized by the TEM. The DLSTM algorithm is evaluated with both simulated data and real CT images (from 1794 National Lung Screening Trial (NLST) patients with longitudinal scans and 1420 clinically studied patients). Experimental results on simulated data indicate the DLSTM can capture families of temporal relationships that cannot be detected with traditional LSTM. Cross-validation on empirical CT datasets demonstrates that DLSTM achieves leading performance on both regularly and irregularly sampled data (e.g., improving LSTM from 0.6785 to 0.7085 on F1 score in NLST). In external validation on irregularly acquired data, the benchmarks achieved 0.8350 (CNN feature) and 0.8380 (with LSTM) on AUC score, while the proposed DLSTM achieves 0.8905. In conclusion, the DLSTM approach is shown to be compatible with families of linear, quadratic, exponential, and log-exponential temporal models.
The DLSTM can be readily extended with other temporal dependence interactions while hardly increasing overall model complexity. (c) 2020 Elsevier B.V. All rights reserved.
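The abstract describes time-distanced input and forget gates modulated by a temporal emphasis model (TEM), but this record does not give the exact gate equations. The following is a minimal illustrative sketch, not the authors' formulation: it assumes an exponential TEM weight computed from each scan's time distance to the last scan, and assumes that weight scales the input gate so that newer scans contribute more new information. The function and variable names (`temporal_emphasis`, `dlstm_step`, `decay`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def temporal_emphasis(d, decay=0.5):
    """Exponential TEM weight: ~1 for the newest scan (time distance
    d = 0), decaying for older scans. The paper also reports
    compatibility with linear, quadratic, and log-exponential families."""
    return np.exp(-decay * d)

def dlstm_step(x, h, c, d, params):
    """One LSTM step with a time-distanced input gate (illustrative).

    `d` is the scan's time distance to the last scan in the sequence;
    the TEM weight multiplies the input gate so older scans inject
    less new information into the cell state.
    """
    Wf, Wi, Wo, Wg, bf, bi, bo, bg = params
    z = np.concatenate([h, x])          # concatenated recurrent + input
    w = temporal_emphasis(d)
    f = sigmoid(Wf @ z + bf)            # forget gate
    i = w * sigmoid(Wi @ z + bi)        # time-distanced input gate
    o = sigmoid(Wo @ z + bo)            # output gate
    g = np.tanh(Wg @ z + bg)            # candidate cell update
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

# Run over an irregularly sampled toy sequence of three "scans".
nx, nh = 4, 8
def mk(shape):
    return rng.normal(scale=0.1, size=shape)

params = (mk((nh, nh + nx)), mk((nh, nh + nx)),
          mk((nh, nh + nx)), mk((nh, nh + nx)),
          np.zeros(nh), np.zeros(nh), np.zeros(nh), np.zeros(nh))

xs = [mk(nx) for _ in range(3)]
dists = [2.0, 1.0, 0.0]   # e.g., years to the last scan; newest has d = 0
h, c = np.zeros(nh), np.zeros(nh)
for x, d in zip(xs, dists):
    h, c = dlstm_step(x, h, c, d, params)
```

Because the time distance enters only through a scalar per-step weight, this kind of modification adds essentially no parameters, consistent with the abstract's claim that temporal dependence can be modeled while hardly increasing overall complexity.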
Pages: 10