Prediction of dengue cases using the attention-based long short-term memory (LSTM) approach

Cited: 3
Authors
Majeed, Mokhalad A. [1 ]
Shafri, Helmi Z. M. [1 ,2 ]
Wayayok, Aimrun [3 ]
Zulkafli, Zed [1 ]
Affiliations
[1] Univ Putra Malaysia UPM, Dept Civil Engn, Fac Engn, Serdang, Malaysia
[2] Univ Putra Malaysia, Geospatial Informat Sci Res Ctr GISRC, Fac Engn, Serdang, Malaysia
[3] Univ Putra Malaysia, Dept Biol & Agr Engn, Fac Engn, Serdang, Malaysia
Keywords
dengue fever; LSTM; attention; deep learning; Malaysia; outbreaks
DOI
10.4081/gh.2023.1176
CLC Classification Number
R19 [Health Organizations and Services (Health Services Administration)]
Discipline Classification Number
Abstract
This research proposes a 'temporal attention' addition for long short-term memory (LSTM) models for dengue prediction. The number of monthly dengue cases was collected for each of five Malaysian states, i.e. Selangor, Kelantan, Johor, Pulau Pinang and Melaka, from 2011 to 2016. Climatic, demographic, geographic and temporal attributes were used as covariates. The proposed LSTM model with temporal attention was compared with several benchmark models, including a linear support vector machine (LSVM), a radial basis function support vector machine (RBFSVM), a decision tree (DT), a shallow neural network (SANN) and a deep neural network (D-ANN). In addition, experiments were conducted to analyze the impact of look-back settings on each model's performance. The results showed that the attention LSTM (A-LSTM) model performed best, with the stacked attention LSTM (SA-LSTM) model in second place. The LSTM and stacked LSTM (S-LSTM) models performed almost identically, but their accuracy improved when the attention mechanism was added. Indeed, both were found to be superior to the benchmark models mentioned above. The best results were obtained when all attributes were included in the model. The four models (LSTM, S-LSTM, A-LSTM and SA-LSTM) were able to accurately predict dengue presence 1-6 months ahead. Our findings provide a more accurate dengue prediction model than those previously used, with the prospect of also applying this approach in other geographic areas.
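The temporal-attention LSTM described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the framework (PyTorch), layer sizes, the additive form of the attention, the 12-month look-back window and the eight covariates per month are all assumptions made only for this example.

```python
# Minimal sketch (assumed architecture, not the authors' code): an LSTM whose
# hidden states over the look-back window are pooled by a learned temporal
# attention before a linear layer predicts the monthly dengue case count.
import torch
import torch.nn as nn

class AttentionLSTM(nn.Module):
    def __init__(self, n_features: int, hidden_size: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden_size, batch_first=True)
        self.attn = nn.Linear(hidden_size, 1)  # scores each time step
        self.out = nn.Linear(hidden_size, 1)   # regression head for case counts

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, look_back, n_features) of climatic/demographic/temporal covariates
        h, _ = self.lstm(x)                    # (batch, look_back, hidden)
        weights = torch.softmax(self.attn(h), dim=1)  # attention over time steps
        context = (weights * h).sum(dim=1)     # weighted pooling of hidden states
        return self.out(context).squeeze(-1)   # (batch,) predicted cases

# Example: a dummy mini-batch with a 12-month look-back and 8 covariates per month.
model = AttentionLSTM(n_features=8)
x = torch.randn(4, 12, 8)
print(model(x).shape)                          # torch.Size([4])
```

Stacking a second LSTM layer on top of the first (as in the S-LSTM and SA-LSTM variants) would only change the recurrent part of this sketch; the attention pooling stays the same.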
Pages: 11