Iterative multi-task learning for time-series modeling of solar panel PV outputs

Cited by: 52
Authors
Shireen, Tahasin [1 ]
Shao, Chenhui [2 ]
Wang, Hui [1 ]
Li, Jingjing [3 ]
Zhang, Xi [4 ]
Li, Mingyang [5 ]
Affiliations
[1] Florida State Univ, Dept Ind & Mfg Engn, 2525 Pottsdamer St, Tallahassee, FL 32310 USA
[2] Univ Illinois, Dept Mech Sci & Engn, 1206 W Green St, Urbana, IL 61801 USA
[3] Penn State Univ, Dept Ind & Mfg Engn, 310 Leonhard Bldg, University Pk, PA 16802 USA
[4] Peking Univ, Dept Ind Engn & Management, 298 Chengfu Rd, Beijing 100871, Peoples R China
[5] Univ S Florida, Dept Ind & Management Syst Engn, Tampa, FL 33620 USA
Funding
U.S. National Science Foundation
Keywords
Multi-task learning; Time series; Solar panels; Prediction; Forecasting; WAVELET; FORECAST;
DOI
10.1016/j.apenergy.2017.12.058
Chinese Library Classification (CLC)
TE [Petroleum and Natural Gas Industry]; TK [Energy and Power Engineering]
Discipline codes
0807; 0820
Abstract
Time-series modeling of PV output for solar panels can help panel owners understand the time-varying behavior of their power systems and prepare for load demand. Such forecasting becomes challenging when many observations are missing or when historical records are insufficient to establish statistical models, and increasing the PV measurement frequency over a longer period raises the cost of detecting PV fluctuations. This paper proposes an efficient iterative multi-task learning approach for time series (MTL-GP-TS) that improves prediction of the PV output without increasing measurement effort by sharing information among PV data from multiple similar solar panels. The proposed iterative MTL-GP-TS model learns/imputes unobserved or missing values in the time-series dataset associated with the solar panel of interest in order to predict the PV trend. The method also generalizes traditional multi-task Gaussian-process learning to capture both the global trend and the local irregular components of a time series. A real-world case study demonstrates that the proposed method can yield substantial improvements in prediction over conventional approaches. The paper also discusses the selection of parameters and data sources when implementing the proposed algorithm.
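The sketch below is not the authors' MTL-GP-TS implementation; it only illustrates the core idea stated in the abstract, namely that pooling PV observations from several similar panels in one Gaussian-process model lets a panel with missing records borrow strength from its neighbors for imputation and prediction. The panel count, the synthetic PV-like series, and the anisotropic RBF-plus-noise kernel are all assumptions made for this example.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
n_panels, n_times = 4, 60
t = np.linspace(0.0, 6.0, n_times)

# Hypothetical PV-like series: a shared smooth trend plus a panel-specific offset and noise.
trend = np.sin(t) + 0.5 * t
Y = np.stack([trend + 0.2 * k + 0.1 * rng.standard_normal(n_times)
              for k in range(n_panels)])

# The panel of interest (index 0) has a long gap of missing observations.
obs = np.ones_like(Y, dtype=bool)
obs[0, 20:45] = False

# Pool all observed samples from all panels; each input is (time, panel index),
# so the kernel can measure similarity both along time and across panels.
X = np.array([[t[j], k] for k in range(n_panels)
              for j in range(n_times) if obs[k, j]])
y = np.array([Y[k, j] for k in range(n_panels)
              for j in range(n_times) if obs[k, j]])

# Anisotropic RBF (one length scale for time, one for the panel index) plus
# observation noise; information is shared across panels through the kernel.
kernel = RBF(length_scale=[1.0, 1.0]) + WhiteKernel(noise_level=0.01)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

# Impute/predict the target panel's missing interval with uncertainty estimates.
X_gap = np.column_stack([t[20:45], np.zeros(25)])
mean, std = gp.predict(X_gap, return_std=True)
print(np.round(mean[:5], 3), np.round(std[:5], 3))

A fuller multi-task treatment (e.g., a coregionalization kernel) or the paper's iterative separation of the global trend from local irregular components would replace this single shared kernel; the sketch only shows why pooling data from similar panels helps when the target panel's history is sparse.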
Pages: 654 - 662
Number of pages: 9
Related papers
50 records in total
  • [1] Constructing Time-Series Momentum Portfolios with Deep Multi-Task Learning
    Ong, Joel
    Herremans, Dorien
    SSRN, 2022
  • [2] A systematic approach to multi-task learning from time-series data
    Mahmoud, Reem A.
    Hajj, Hazem
    Karameh, Fadi N.
    APPLIED SOFT COMPUTING, 2020, 96
  • [3] Multi-task self-supervised time-series representation learning
    Choi, Heejeong
    Kang, Pilsung
    INFORMATION SCIENCES, 2024, 671
  • [4] Constructing time-series momentum portfolios with deep multi-task learning
    Ong, Joel
    Herremans, Dorien
    EXPERT SYSTEMS WITH APPLICATIONS, 2023, 230
  • [5] Multi-Task Diffusion Learning for Time Series Classification
    Zheng, Shaoqiu
    Liu, Zhen
    Tian, Long
    Ye, Ling
    Zheng, Shixin
    Peng, Peng
    Chu, Wei
    ELECTRONICS, 2024, 13 (20)
  • [6] Multi-Task Disentangled Autoencoder for Time-Series Data in Glucose Dynamics
    Lim, Min Hyuk
    Cho, Young Min
    Kim, Sungwan
    IEEE JOURNAL OF BIOMEDICAL AND HEALTH INFORMATICS, 2022, 26 (09) : 4702 - 4713
  • [7] Multifidelity Surrogate Modeling for Time-Series Outputs
    Kerleguer, Baptiste
    SIAM-ASA JOURNAL ON UNCERTAINTY QUANTIFICATION, 2023, 11 (02): : 514 - 539
  • [8] Multi-task Learning Method for Hierarchical Time Series Forecasting
    Yang, Maoxin
    Hu, Qinghua
    Wang, Yun
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2019: TEXT AND TIME SERIES, PT IV, 2019, 11730 : 474 - 485
  • [9] Bayesian Multi-task Learning for Dynamic Time Series Prediction
    Chandra, Rohitash
    Cripps, Sally
    2018 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2018, : 390 - 397
  • [10] End-to-end Multi-task Learning of Missing Value Imputation and Forecasting in Time-Series Data
    Kim, Jinhee
    Kim, Taesung
    Choi, Jang-Ho
    Choo, Jaegul
    2020 25TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2021, : 8849 - 8856