Improving Recurrent Neural Network Responsiveness to Acute Clinical Events

Cited by: 1
Authors
Ledbetter, David R. [1 ]
Laksana, Eugene [1 ]
Aczon, Melissa [1 ]
Wetzel, Randall [1 ]
Affiliations
[1] Childrens Hosp Los Angeles, Laura P & Leland K Whittier Virtual Pediat Intens, Los Angeles, CA 90027 USA
Source
IEEE ACCESS | 2021 / Vol. 9
Keywords
Electronic medical records; health and safety; Kalman filters; long short-term memory; machine learning; performance evaluation; predictive models; real time systems; recurrent neural networks; time series analysis; PEDIATRIC RISK; MORTALITY; PREDICTION; INDEX; MODEL;
DOI
10.1109/ACCESS.2021.3099996
CLC Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Predictive models in acute care settings must immediately recognize precipitous changes in a patient's status when presented with data reflecting such changes. Recurrent neural networks (RNNs) have become popular for clinical decision support models but exhibit a delayed response to acute events: new information must propagate through the RNN's cell state before its full impact is reflected in the model's predictions. Input data perseveration is a method to train more responsive RNN-based models. Input data are replicated k times during training and deployment. Each replication propagates through the cell state and output of the RNN, but only the output at the final replication is maintained and broadcast as the prediction for evaluation. De-identified Electronic Medical Records (EMR) of 12,826 patients admitted to a tertiary care pediatric academic center between 01/2009 and 02/2019 were analyzed. A baseline Long Short-Term Memory (LSTM) model (k = 1), four LSTMs with increasing amounts of input data perseveration (k = 2 to k = 5), and an LSTM with an attention mechanism were trained to predict ICU mortality. Model performance was compared using Area Under the Receiver Operating Characteristic Curve (AUROC) after increasing periods of observation from one to 12 hours. The average variation of the change in predicted mortality immediately following defined acute events measured responsiveness. The AUROC gains due to input perseveration were larger at the earlier times of prediction (<= 6 hours), increasing at the first hour from 0.77 with no input data perseveration to 0.83 when k = 5. An LSTM with k = 5 was 2-3 times more responsive to acute events than a baseline LSTM.
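The perseveration mechanism described above can be illustrated with a minimal sketch. The `rnn_step` function here is a toy stand-in (an exponential-moving-average update) for the paper's LSTM cell, and `predict_with_perseveration` is a hypothetical helper, not the authors' code; the point is only the k-fold replication of each input before the next time step, with only the final replication's output kept.

```python
def rnn_step(h, x, alpha=0.5):
    # Toy recurrent update: new information blends into the hidden state
    # gradually, mimicking how an LSTM's cell state absorbs an input over
    # several steps rather than all at once.
    return (1 - alpha) * h + alpha * x

def predict_with_perseveration(inputs, k=1, h0=0.0):
    """Feed each input k times; keep only the output of the k-th replication."""
    h = h0
    preds = []
    for x in inputs:
        for _ in range(k):      # replicate the same input k times
            h = rnn_step(h, x)
        preds.append(h)         # only the final replication's output is kept
    return preds

# An abrupt change in the input (0 -> 1, standing in for an acute event)
# registers more fully in the prediction when k is larger:
inputs = [0.0, 0.0, 1.0]
baseline = predict_with_perseveration(inputs, k=1)  # -> final prediction 0.5
persever = predict_with_perseveration(inputs, k=5)  # -> final prediction 0.96875
```

With k = 1 the toy state absorbs only half of the new input in one step, while with k = 5 it has converged almost entirely to the new value by the time the prediction is emitted, which is the responsiveness effect the abstract quantifies.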
Pages: 106140-106151
Page count: 12