Evaluation of a transformer-based model for the temporal forecast of coarse particulate matter (PMCO) concentrations

Cited by: 0
Authors
Mauricio-Alvarez, Luis Eduardo [1 ]
Aceves-Fernandez, Marco Antonio [1 ]
Pedraza-Ortega, Jesus Carlos [1 ]
Ramos-Arreguin, Juan Manuel [1 ]
Institutions
[1] Autonomous Univ Queretaro, Fac Engn, Cerro Campanas, Queretaro 76010, Queretaro, Mexico
Keywords
Deep learning; Forecasting; Transformer; Air pollution; PMCO; NEURAL-NETWORK;
DOI
10.1007/s12145-024-01330-6
CLC Classification
TP39 [Computer Applications]
Subject Classification
081203; 0835
Abstract
Accurate forecasting of coarse particulate matter (PMCO) concentrations is crucial for mitigating health risks and environmental impacts in urban areas. This study evaluates the performance of a transformer-based deep learning model for predicting PMCO levels using 2022 data from four monitoring stations (BJU, MER, TLA, UIZ) in Mexico City. The transformer model's forecasting accuracy is assessed for horizons of 12, 24, 48, and 72 hours ahead and compared against conventional autoregressive integrated moving average (ARIMA) and long short-term memory (LSTM) models. Error metrics including root mean square error (RMSE), mean absolute error (MAE), and mean absolute percentage error (MAPE) are employed for evaluation. Results demonstrate the transformer model's superior performance, achieving the lowest error values across multiple stations and prediction horizons. However, challenges are identified for short-term forecasts and sites near industrial areas with high PMCO variability. The study highlights the transformer model's potential for accurate PMCO forecasting while underscoring the need for interdisciplinary approaches to address complex air pollution dynamics in urban environments.
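The three evaluation metrics named in the abstract (RMSE, MAE, MAPE) can be sketched as below. The function names and sample values are illustrative only, not taken from the study; MAPE as written assumes no zero-valued observations.

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean square error."""
    return float(np.sqrt(np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2)))

def mae(y_true, y_pred):
    """Mean absolute error."""
    return float(np.mean(np.abs(np.asarray(y_true) - np.asarray(y_pred))))

def mape(y_true, y_pred):
    """Mean absolute percentage error (%); undefined for zero observations."""
    y_true = np.asarray(y_true, dtype=float)
    return float(np.mean(np.abs((y_true - np.asarray(y_pred)) / y_true)) * 100)

# Illustrative PMCO-like values (ug/m3); not data from the paper.
obs = [20.0, 25.0, 30.0]
pred = [18.0, 27.0, 33.0]
print(rmse(obs, pred), mae(obs, pred), mape(obs, pred))
```

Lower values on all three metrics indicate better forecasts; MAPE is scale-free, which is why studies like this one often report it alongside the scale-dependent RMSE and MAE.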
Pages: 3095-3110
Number of pages: 16