Improving Recurrent Neural Network Responsiveness to Acute Clinical Events

Cited by: 1
Authors
Ledbetter, David R. [1 ]
Laksana, Eugene [1 ]
Aczon, Melissa [1 ]
Wetzel, Randall [1 ]
Affiliation
[1] Children's Hospital Los Angeles, Laura P. & Leland K. Whittier Virtual Pediatric Intensive Care Unit, Los Angeles, CA 90027 USA
Source
IEEE ACCESS | 2021, Vol. 9
Keywords
Electronic medical records; health and safety; Kalman filters; long short-term memory; machine learning; performance evaluation; predictive models; real time systems; recurrent neural networks; time series analysis; PEDIATRIC RISK; MORTALITY; PREDICTION; INDEX; MODEL;
DOI
10.1109/ACCESS.2021.3099996
Chinese Library Classification (CLC) Number
TP [Automation technology; computer technology];
Discipline Classification Code
0812 ;
Abstract
Predictive models in acute care settings must immediately recognize precipitous changes in a patient's status when presented with data reflecting such changes. Recurrent neural networks (RNNs) have become popular for clinical decision support models but exhibit a delayed response to acute events. New information must propagate through the RNN's cell state before its full impact is reflected in the model's predictions. Input data perseveration is a method to train more responsive RNN-based models: input data are replicated k times during training and deployment. Each replication propagates through the cell state and output of the RNN, but only the output at the final replication is retained and broadcast as the prediction for evaluation. De-identified Electronic Medical Records (EMR) of 12,826 patients admitted to a tertiary care pediatric academic center between 01/2009-02/2019 were analyzed. A baseline Long Short-Term Memory (LSTM) model (k = 1), four LSTMs with increasing amounts of input data perseveration (k = 2 to k = 5), and an LSTM with an attention mechanism were trained to predict ICU mortality. Performance of the models was compared using Area Under the Receiver Operating Characteristic Curve (AUROC) after increasing periods of observation from one to 12 hours. Responsiveness was measured by the average magnitude of the change in predicted mortality immediately following defined acute events. The AUROC gains due to input perseveration were larger at the earlier times of prediction (<= 6 hours), increasing at the first hour from 0.77 with no input data perseveration to 0.83 when k = 5. An LSTM with k = 5 was 2-3 times more responsive to acute events than a baseline LSTM.
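The perseveration scheme described in the abstract can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' implementation: a leaky-integrator update stands in for the LSTM cell state, and `persevere_predict` is a hypothetical helper name. It shows the core mechanic — each input is fed k times, and only the output after the k-th replication is kept as that timestep's prediction — and why a larger k reacts faster to a sudden input change.

```python
def persevere_predict(inputs, k=1, alpha=0.2):
    """Toy recurrent model illustrating input data perseveration.

    A leaky integrator (h moves a fraction alpha toward the input)
    stands in for an LSTM cell state. Each input is replicated k
    times; only the output after the final replication is kept.
    """
    h = 0.0
    preds = []
    for x in inputs:
        for _ in range(k):           # feed the same input k times
            h = h + alpha * (x - h)  # state update per replication
        preds.append(h)              # keep only the k-th output
    return preds

# A simulated "acute event": the input jumps from 0 to 1 at the last step.
seq = [0.0, 0.0, 0.0, 1.0]
baseline = persevere_predict(seq, k=1)   # sluggish response
persever = persevere_predict(seq, k=5)   # new value propagates 5x per step
```

With these toy dynamics, the baseline prediction moves only a fraction alpha toward the new input after the event, while the k = 5 model applies five state updates in the same wall-clock step and therefore shifts much further — the same qualitative responsiveness effect the paper quantifies.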
Pages: 106140-106151 (12 pages)