Global Sensitivity Analysis in Load Modeling via Low-Rank Tensor

Cited by: 13
Authors
Lin, You [1 ,2 ]
Wang, Yishen [3 ]
Wang, Jianhui [2 ]
Wang, Siqi [3 ]
Shi, Di [3 ]
Affiliations
[1] GEIRI North Amer, AI & Syst Analyt Grp, San Jose, CA 95134 USA
[2] Southern Methodist Univ, Dept Elect & Comp Engn, Dallas, TX 75205 USA
[3] GEIRI North Amer, San Jose, CA 95134 USA
Keywords
Load modeling; Tensile stress; Computational modeling; Mathematical model; Parameter estimation; Voltage measurement; Reactive power; Dimensionality reduction; load modeling; parameter estimation; sensitivity analysis; tensor;
DOI
10.1109/TSG.2020.2978769
Chinese Library Classification (CLC): TM [Electrical Engineering]; TN [Electronics and Communication Technology]
Subject classification codes: 0808; 0809
Abstract
Growing model complexity in load modeling has created high dimensionality in parameter estimation, thereby substantially increasing the associated computational cost. In this letter, a tensor-based method is proposed for identifying composite load modeling (CLM) parameters and for conducting a global sensitivity analysis. The tensor format and Fokker-Planck equations are used to estimate the power output response of the CLM when the parameters vary simultaneously over their full distribution ranges. The proposed tensor structure is shown to be effective for tackling high-dimensional parameter estimation and for improving the computational performance of load modeling through global sensitivity analysis.
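As a rough, hypothetical illustration (not the authors' algorithm or code), the sketch below shows why a low-rank tensor surrogate makes variance-based global sensitivity analysis cheap: if the power-response surrogate is stored in CP (canonical polyadic) format over independent, gridded parameters, then the mean, variance, and first-order Sobol' indices all factor along the tensor modes. Every specific choice here (the random factor matrices standing in for a fitted surrogate, the grid sizes, the rank, the uniform weights on grid nodes) is an assumption made only for this sketch.

```python
# Minimal, hypothetical sketch: first-order Sobol' indices for a surrogate in
# CP (canonical polyadic) low-rank format, f(x) = sum_r prod_d g[d][r, i_d],
# with independent inputs discretised on per-parameter grids and uniform
# weights on the grid nodes. Not taken from the cited letter.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 4 load-model parameters, each on a 50-point grid, and a
# rank-3 CP surrogate with random factor matrices (stand-ins for factors that
# would be fitted to simulated power-response data).
n_params, grid_size, rank = 4, 50, 3
factors = [rng.standard_normal((rank, grid_size)) for _ in range(n_params)]

# Per-mode moments under the (assumed uniform) grid measure.
mean_g = [G.mean(axis=1) for G in factors]        # E[g_{r,d}],          shape (rank,)
cross_g = [G @ G.T / grid_size for G in factors]  # E[g_{r,d} g_{r',d}], shape (rank, rank)

# Mean and variance of the CP surrogate factor across modes.
mean_f = np.prod(np.stack(mean_g), axis=0).sum()          # E[f]   = sum_r prod_d E[g_{r,d}]
second_moment = np.prod(np.stack(cross_g), axis=0).sum()  # E[f^2] = sum_{r,r'} prod_d E[g g']
var_f = second_moment - mean_f**2

# First-order Sobol' index of parameter d:
#   S_d = Var_{x_d}( E[f | x_d] ) / Var(f),
# where E[f | x_d] is itself a short expansion in x_d alone.
sobol_first = []
for d in range(n_params):
    coeff = np.prod(
        np.stack([mean_g[k] for k in range(n_params) if k != d]), axis=0
    )                                    # prod_{k != d} E[g_{r,k}], shape (rank,)
    cond_mean = coeff @ factors[d]       # E[f | x_d] evaluated on the grid of x_d
    sobol_first.append(cond_mean.var() / var_f)

print("first-order Sobol indices:", np.round(sobol_first, 3))
```

The point of the factorisation is that every expectation above is a product of one-dimensional grid averages, so the cost grows linearly with the number of parameters rather than exponentially, which is the generic reason low-rank tensor formats are attractive for global sensitivity analysis.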
Pages: 2737-2740
Page count: 4
Related papers
50 records in total (entries [31]-[40] shown below)
  • [31] Low-rank tensor completion via combined non-local self-similarity and low-rank regularization
    Li, Xiao-Tong
    Zhao, Xi-Le
    Jiang, Tai-Xiang
    Zheng, Yu-Bang
    Ji, Teng-Yu
    Huang, Ting-Zhu
    NEUROCOMPUTING, 2019, 367 : 1 - 12
  • [32] Denoising of low-dose CT images via low-rank tensor modeling and total variation regularization
    Sagheer, Sameera V. Mohd
    George, Sudhish N.
    ARTIFICIAL INTELLIGENCE IN MEDICINE, 2019, 94 : 1 - 17
  • [33] Sparse and Low-Rank Tensor Decomposition
    Shah, Parikshit
    Rao, Nikhil
    Tang, Gongguo
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 28 (NIPS 2015), 2015, 28
  • [34] NONPARAMETRIC LOW-RANK TENSOR IMPUTATION
    Bazerque, Juan Andres
    Mateos, Gonzalo
    Giannakis, Georgios B.
    2012 IEEE STATISTICAL SIGNAL PROCESSING WORKSHOP (SSP), 2012, : 876 - 879
  • [35] Low-Rank Regression with Tensor Responses
    Rabusseau, Guillaume
    Kadri, Hachem
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 29 (NIPS 2016), 2016, 29
  • [36] MULTIRESOLUTION LOW-RANK TENSOR FORMATS
    Mickelin, Oscar
    Karaman, Sertac
    SIAM JOURNAL ON MATRIX ANALYSIS AND APPLICATIONS, 2020, 41 (03) : 1086 - 1114
  • [37] Low-rank tensor completion via tensor tri-factorization and sparse transformation
    Yang, Fanyin
    Zheng, Bing
    Zhao, Ruijuan
    SIGNAL PROCESSING, 2025, 233
  • [38] LOW-RANK TENSOR HUBER REGRESSION
    Wei, Yangxin
    Luo, Ziyan
    Chen, Yang
    PACIFIC JOURNAL OF OPTIMIZATION, 2022, 18 (02): 439 - 458
  • [39] Low-Rank Tensor MMSE Equalization
    Ribeiro, Lucas N.
    de Almeida, Andre L. F.
    Mota, Joao C. M.
    2019 16TH INTERNATIONAL SYMPOSIUM ON WIRELESS COMMUNICATION SYSTEMS (ISWCS), 2019, : 511 - 516
  • [40] Low-Rank Tensor Completion via Tensor Nuclear Norm With Hybrid Smooth Regularization
    Zhao, Xi-Le
    Nie, Xin
    Zheng, Yu-Bang
    Ji, Teng-Yu
    Huang, Ting-Zhu
    IEEE ACCESS, 2019, 7 : 131888 - 131901