Predicting Water Temperature Dynamics of Unmonitored Lakes With Meta-Transfer Learning

Times Cited: 46
Authors
Willard, Jared D. [1 ,2 ]
Read, Jordan S. [2 ]
Appling, Alison P. [2 ]
Oliver, Samantha K. [2 ]
Jia, Xiaowei [3 ]
Kumar, Vipin [1 ]
Affiliations
[1] Univ Minnesota, Dept Comp Sci & Engn, Minneapolis, MN 55455 USA
[2] US Geol Survey, Middleton, WI 53562 USA
[3] Univ Pittsburgh, Dept Comp Sci, Pittsburgh, PA 15260 USA
Funding
U.S. National Science Foundation;
Keywords
lake temperature; meta learning; transfer learning; physics-guided deep learning; machine learning; water resources; TEMPORAL COHERENCE; NEURAL-NETWORKS; UNITED-STATES; SENSOR DATA; FUTURE; CLASSIFICATION; MODELS; STRATIFICATION; WISCONSIN; HYDROLOGY;
DOI
10.1029/2021WR029579
Chinese Library Classification (CLC)
X [Environmental Science; Safety Science];
Discipline Codes
08; 0830;
Abstract
Most environmental data come from a minority of well-monitored sites. An ongoing challenge in the environmental sciences is transferring knowledge from monitored sites to unmonitored sites. Here, we demonstrate a novel transfer-learning framework that accurately predicts depth-specific temperature in unmonitored lakes (targets) by borrowing models from well-monitored lakes (sources). This method, meta-transfer learning (MTL), builds a meta-learning model to predict transfer performance from candidate source models to targets using lake attributes and candidates' past performance. We constructed source models at 145 well-monitored lakes using calibrated process-based (PB) modeling and a recently developed approach called process-guided deep learning (PGDL). We applied MTL to either PB or PGDL source models (PB-MTL or PGDL-MTL, respectively) to predict temperatures in 305 target lakes treated as unmonitored in the Upper Midwestern United States. We show significantly improved performance relative to the uncalibrated PB General Lake Model, where the median root mean squared error (RMSE) for the target lakes is 2.52 degrees C. PB-MTL yielded a median RMSE of 2.43 degrees C; PGDL-MTL yielded 2.16 degrees C; and a PGDL-MTL ensemble of nine sources per target yielded 1.88 degrees C. For sparsely monitored target lakes, PGDL-MTL often outperformed PGDL models trained on the target lakes themselves. Differences in maximum depth between the source and target were consistently the most important predictors. Our approach readily scales to thousands of lakes in the Midwestern United States, demonstrating that MTL with meaningful predictor variables and high-quality source models is a promising approach for many kinds of unmonitored systems and environmental variables.
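The source-selection step described in the abstract can be sketched as follows. This is a minimal illustration under assumed data, not the authors' implementation: the attribute vectors, sample sizes, and the 1-nearest-neighbour stand-in for the meta-model are all hypothetical. The paper itself trains a meta-model on observed transfer performance among monitored lakes and ranks candidate source models per target by predicted RMSE.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 5 source lakes, 3 target lakes, 2 attributes
# per lake (e.g., max depth in m, surface area in km^2).
n_sources, n_targets = 5, 3
src_attrs = rng.uniform([2.0, 0.1], [30.0, 5.0], size=(n_sources, 2))
tgt_attrs = rng.uniform([2.0, 0.1], [30.0, 5.0], size=(n_targets, 2))

# Past transfer experiments among monitored lakes supply training
# pairs of (attribute difference, observed transfer RMSE); here the
# RMSE is synthesized so that larger depth differences hurt transfer,
# echoing the paper's finding that depth difference matters most.
train_diffs = rng.normal(size=(40, 2))
train_rmse = 1.5 + 0.8 * np.abs(train_diffs[:, 0]) + rng.normal(0, 0.1, 40)

def predict_rmse(diff):
    """Meta-model stand-in: 1-nearest-neighbour lookup on the
    source-target attribute difference."""
    d = np.linalg.norm(train_diffs - diff, axis=1)
    return train_rmse[np.argmin(d)]

# For each unmonitored target, rank candidate sources by predicted
# RMSE and pick the best one (the paper also ensembles the top nine).
for t in range(n_targets):
    preds = [predict_rmse(tgt_attrs[t] - src_attrs[s])
             for s in range(n_sources)]
    best = int(np.argmin(preds))
    print(f"target {t}: best source {best}, predicted RMSE {preds[best]:.2f}")
```

In the paper the meta-model is a learned regressor over many lake attributes and past model performance rather than this nearest-neighbour lookup, but the ranking-and-selection logic is the same.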
Pages: 20
Related Papers
50 items in total
  • [41] SelfMTL: Self-Supervised Meta-Transfer Learning via Contrastive Representation for Hyperspectral Target Detection
    Luo, Fulin; Shi, Shanshan; Qin, Kai; Guo, Tan; Fu, Chuan; Lin, Zhiping
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2025, 63
  • [42] Predicting shallow water dynamics using echo-state networks with transfer learning
    Chen, Xiaoqian; Nadiga, Balasubramanya T.; Timofeyev, Ilya
    GEM-INTERNATIONAL JOURNAL ON GEOMATHEMATICS, 2022, 13 (01)
  • [44] A new meta-transfer learning method with freezing operation for few-shot bearing fault diagnosis
    Wang, Peiqi; Li, Jingde; Wang, Shubei; Zhang, Fusheng; Shi, Juanjuan; Shen, Changqing
    MEASUREMENT SCIENCE AND TECHNOLOGY, 2023, 34 (07)
  • [45] Meta-transfer Learning for Massive MIMO Channel Estimation for Millimeter-Wave Outdoor Vehicular Environments
    Tolba, Bassant; Abd El-Malek, Ahmed H.; Abo-Zahhad, Mohammed; Elsabrouty, Maha
    2023 IEEE 20TH CONSUMER COMMUNICATIONS & NETWORKING CONFERENCE, CCNC, 2023
  • [46] Evaluating deep learning architecture and data assimilation for improving water temperature forecasts at unmonitored locations
    Zwart, Jacob A.; Diaz, Jeremy; Hamshaw, Scott; Oliver, Samantha; Ross, Jesse C.; Sleckman, Margaux; Appling, Alison P.; Corson-Dosch, Hayley; Jia, Xiaowei; Read, Jordan; Sadler, Jeffrey; Thompson, Theodore; Watkins, David; White, Elaheh
    FRONTIERS IN WATER, 2023, 5
  • [47] Efficient framework for low-resource abstractive summarization by meta-transfer learning and pointer-generator networks
    Huh, Taehun; Ko, Youngjoong
    EXPERT SYSTEMS WITH APPLICATIONS, 2023, 234
  • [48] Meta-transfer learning-based method for multi-fault analysis and assessment in power system
    Zheng, Lingfeng; Zhu, Yuhong; Zhou, Yongzhi
    APPLIED INTELLIGENCE, 2024, 54 (23) : 12112 - 12127
  • [49] A multi-head attention network with adaptive meta-transfer learning for RUL prediction of rocket engines
    Pan, Tongyang; Chen, Jinglong; Ye, Zhisheng; Li, Aimin
    RELIABILITY ENGINEERING & SYSTEM SAFETY, 2022, 225
  • [50] Good Meta-tasks Make A Better Cross-lingual Meta-transfer Learning for Low-resource Languages
    Wu, Linjuan; Guo, Zongyi; Cui, Baoliang; Tang, Haihong; Lu, Weiming
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS - EMNLP 2023, 2023, : 7431 - 7446