Time-distanced gates in long short-term memory networks

Cited by: 20
Authors
Gao, Riqiang [1 ]
Tang, Yucheng [1 ]
Xu, Kaiwen [1 ]
Huo, Yuankai [1 ]
Bao, Shunxing [1 ]
Antic, Sanja L. [2 ]
Epstein, Emily S. [2 ]
Deppen, Steve [3 ]
Paulson, Alexis B. [4 ]
Sandler, Kim L. [4 ]
Massion, Pierre P. [2 ]
Landman, Bennett A. [1 ,4 ]
Affiliations
[1] Vanderbilt Univ, Elect Engn & Comp Sci, Nashville, TN 37235 USA
[2] Vanderbilt Univ, Sch Med, Med, Nashville, TN 37235 USA
[3] Vanderbilt Univ, Med Ctr, Thorac Surg, Nashville, TN 37235 USA
[4] Vanderbilt Univ, Med Ctr, Radiol, Nashville, TN 37235 USA
Keywords
Lung cancer diagnosis; Longitudinal; Distanced LSTM; Temporal Emphasis Model
DOI
10.1016/j.media.2020.101785
Chinese Library Classification
TP18 [Theory of artificial intelligence]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
The Long Short-Term Memory (LSTM) network is widely used to model sequential observations in fields ranging from natural language processing to medical imaging. The LSTM has shown promise for interpreting computed tomography (CT) in lung screening protocols. Yet traditional image-based LSTM models ignore interval differences, while recently proposed interval-modeled LSTM variants are limited in their ability to interpret temporal proximity. Meanwhile, clinical imaging may be acquired at irregular intervals, and such sampling patterns may be commingled with clinical usage. In this paper, we propose the Distanced LSTM (DLSTM) by introducing time-distanced (i.e., time distance to the last scan) gates with a temporal emphasis model (TEM), targeting lung cancer diagnosis (i.e., evaluating the malignancy of pulmonary nodules). Briefly, (1) the time distance of every scan to the last scan is modeled explicitly, (2) time-distanced input and forget gates in the DLSTM are introduced across regularly and irregularly sampled sequences, and (3) the newer scans in serial data are emphasized by the TEM. The DLSTM algorithm is evaluated with both simulated data and real CT images (from 1794 National Lung Screening Trial (NLST) patients with longitudinal scans and 1420 clinically studied patients). Experimental results on simulated data indicate that the DLSTM can capture families of temporal relationships that cannot be detected with a traditional LSTM. Cross-validation on empirical CT datasets demonstrates that the DLSTM achieves leading performance on both regularly and irregularly sampled data (e.g., improving the F1 score over the LSTM from 0.6785 to 0.7085 on NLST). In external validation on irregularly acquired data, the benchmarks achieved AUCs of 0.8350 (CNN features) and 0.8380 (with LSTM), while the proposed DLSTM achieved 0.8905. In conclusion, the DLSTM approach is shown to be compatible with families of linear, quadratic, exponential, and log-exponential temporal models. The DLSTM can be readily extended with other temporal dependence interactions with little increase in overall model complexity. (c) 2020 Elsevier B.V. All rights reserved.
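The abstract does not spell out the gate equations, so the following is a minimal sketch of how time-distanced input and forget gates with a temporal emphasis model could look in code. It is an illustration under stated assumptions, not the authors' reference implementation: the class name DLSTMCellSketch, the exponential form of the TEM, and the decay parameter lam are hypothetical choices; the paper reports compatibility with linear, quadratic, exponential, and log-exponential temporal families, any of which could replace the emphasis function below.

import torch
import torch.nn as nn


class DLSTMCellSketch(nn.Module):
    """LSTM cell whose input and forget gates are scaled by a temporal
    emphasis model (TEM) of the time distance to the last scan.
    Sketch only; names and the exponential TEM are assumptions."""

    def __init__(self, input_size: int, hidden_size: int, lam: float = 0.01):
        super().__init__()
        self.hidden_size = hidden_size
        self.lam = lam  # decay rate of the assumed exponential TEM
        # One linear map produces all four gate pre-activations at once.
        self.gates = nn.Linear(input_size + hidden_size, 4 * hidden_size)

    def temporal_emphasis(self, dist: torch.Tensor) -> torch.Tensor:
        # Assumed exponential TEM: close to 1 when dist is near 0 (recent
        # scan), decaying for older scans. Any monotone decreasing g(dist)
        # from the families named in the abstract could be swapped in here.
        return torch.exp(-self.lam * dist)

    def forward(self, x, dist, state):
        # x:     (batch, input_size) features of the current scan
        # dist:  (batch,) time distance from this scan to the last scan
        # state: tuple (h, c), each of shape (batch, hidden_size)
        h, c = state
        z = self.gates(torch.cat([x, h], dim=1))
        i, f, g, o = z.chunk(4, dim=1)
        w = self.temporal_emphasis(dist).unsqueeze(1)  # (batch, 1)
        # Time-distanced gates: scans closer to the last scan (small dist)
        # influence the cell update more strongly than distant ones.
        i = torch.sigmoid(i) * w
        f = torch.sigmoid(f) * w
        c_new = f * c + i * torch.tanh(g)
        h_new = torch.sigmoid(o) * torch.tanh(c_new)
        return h_new, (h_new, c_new)


# Example usage with an irregularly sampled series: three scans acquired
# 900, 400, and 0 days before the most recent scan (feature sizes are made up).
cell = DLSTMCellSketch(input_size=64, hidden_size=32)
x_seq = torch.randn(3, 2, 64)  # (time, batch, features)
dists = torch.tensor([[900.0, 900.0], [400.0, 400.0], [0.0, 0.0]])
state = (torch.zeros(2, 32), torch.zeros(2, 32))
for t in range(x_seq.size(0)):
    out, state = cell(x_seq[t], dists[t], state)

In this sketch the same emphasis weight scales both the input and forget gates; the actual DLSTM may couple the TEM to each gate differently, so the paper (DOI above) should be consulted for the exact formulation.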
Pages: 10