Global Sensitivity Analysis in Load Modeling via Low-Rank Tensor

Cited by: 13
Authors
Lin, You [1 ,2 ]
Wang, Yishen [3 ]
Wang, Jianhui [2 ]
Wang, Siqi [3 ]
Shi, Di [3 ]
Affiliations
[1] GEIRI North Amer, AI & Syst Analyt Grp, San Jose, CA 95134 USA
[2] Southern Methodist Univ, Dept Elect & Comp Engn, Dallas, TX 75205 USA
[3] GEIRI North Amer, San Jose, CA 95134 USA
Keywords
Load modeling; Tensile stress; Computational modeling; Mathematical model; Parameter estimation; Voltage measurement; Reactive power; Dimensionality reduction; Sensitivity analysis; Tensor
DOI
10.1109/TSG.2020.2978769
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology]
Discipline Classification Code
0808; 0809
Abstract
Growing model complexity in load modeling has created high dimensionality in parameter estimation, thereby substantially increasing the associated computational cost. In this letter, a tensor-based method is proposed for identifying composite load model (CLM) parameters and for conducting a global sensitivity analysis. The tensor format and Fokker-Planck equations are used to estimate the power output response of the CLM when all parameters vary simultaneously over their full distribution ranges. The proposed tensor structure is shown to be effective for tackling high-dimensional parameter estimation and for improving the computational performance of load modeling through global sensitivity analysis.
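The letter itself does not include code. The following Python sketch is an illustrative assumption, not the authors' implementation: it shows how first-order (Sobol) sensitivity indices can be read off a low-rank, separable (CP-style) surrogate using only one-dimensional quadratures instead of repeated model runs, in the spirit of low-rank-tensor global sensitivity analysis (cf. Konakli and Sudret, related paper [1] below). The rank, grid, and random factors stand in for a surrogate that would, in practice, be fitted to simulated P/Q responses of the CLM.

# Minimal sketch (not the authors' code): first-order Sobol indices computed
# analytically from a rank-R separable (CP-style) surrogate, so that global
# sensitivity analysis needs only cheap one-dimensional quadratures.
# The factors g, grid size, and rank below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
d, R, n = 4, 3, 201                     # number of parameters, CP rank, grid points
grid = np.linspace(0.0, 1.0, n)         # each load-model parameter scaled to [0, 1]
w = np.full(n, 1.0 / n)                 # quadrature weights for a uniform density

# Hypothetical positive rank-R factors g[r, k, :]; in practice they would be
# fitted to simulated power responses of the composite load model (CLM).
g = 0.5 + rng.random((R, d, n))

def cp_eval(factors, X):
    """Evaluate f(x) = sum_r prod_k g_{rk}(x_k) by nearest-grid lookup."""
    idx = np.clip(np.rint(X * (n - 1)).astype(int), 0, n - 1)   # (m, d)
    vals = factors[:, np.arange(d), idx]                        # (R, m, d)
    return vals.prod(axis=2).sum(axis=0)                        # (m,)

# One-dimensional moments of the factors (the only integrals ever needed).
mu = np.einsum('rkn,n->rk', g, w)           # E[g_{rk}(X_k)]
xc = np.einsum('rkn,skn,n->rsk', g, g, w)   # E[g_{rk}(X_k) g_{sk}(X_k)]

mean = mu.prod(axis=1).sum()                # E[f]
var = xc.prod(axis=2).sum() - mean**2       # Var[f]

# First-order Sobol index S_i = Var( E[f | X_i] ) / Var[f].
P = mu.prod(axis=1)                         # prod_k E[g_{rk}], shape (R,)
S = np.empty(d)
for i in range(d):
    pr = P / mu[:, i]                       # prod_{k != i} E[g_{rk}] (factors are positive)
    V_i = np.einsum('rs,r,s->', xc[:, :, i], pr, pr) - mean**2
    S[i] = V_i / var

print("first-order Sobol indices:", np.round(S, 3))

# Sanity check: Monte Carlo variance of the surrogate should match `var`.
X_mc = rng.random((20000, d))
print("analytic var %.4f vs MC var %.4f" % (var, cp_eval(g, X_mc).var()))

In this sketch every expectation reduces to a length-n dot product against the factor grids, which is the kind of computational saving the letter attributes to the tensor format; the paper's coupling of the tensor structure with Fokker-Planck-based response estimation is not reproduced here.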
Pages: 2737-2740
Page count: 4
Related Papers (50 in total)
  • [1] Global sensitivity analysis using low-rank tensor approximations
    Konakli, Katerina
    Sudret, Bruno
    RELIABILITY ENGINEERING & SYSTEM SAFETY, 2016, 156 : 64 - 83
  • [2] Power Grid Faults Classification via Low-Rank Tensor Modeling
    Repasky, Matthew
    Xie, Yao
    Zhang, Yichen
    Qiu, Feng
    FIFTY-SEVENTH ASILOMAR CONFERENCE ON SIGNALS, SYSTEMS & COMPUTERS, 2023, : 849 - 853
  • [3] Noisy Tensor Completion via Low-Rank Tensor Ring
    Qiu, Yuning
    Zhou, Guoxu
    Zhao, Qibin
    Xie, Shengli
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (01) : 1127 - 1141
  • [4] Online Robust Low-Rank Tensor Modeling for Streaming Data Analysis
    Li, Ping
    Feng, Jiashi
    Jin, Xiaojie
    Zhang, Luming
    Xu, Xianghua
    Yan, Shuicheng
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2019, 30 (04) : 1061 - 1075
  • [5] Robust Low-Rank and Sparse Tensor Decomposition for Low-Rank Tensor Completion
    Shi, Yuqing
    Du, Shiqiang
    Wang, Weilan
    PROCEEDINGS OF THE 33RD CHINESE CONTROL AND DECISION CONFERENCE (CCDC 2021), 2021, : 7138 - 7143
  • [6] Low-Rank Tensor Modeling of Room Impulse Responses
    Jalmby, Martin
    Elvander, Filip
    van Waterschoot, Toon
    29TH EUROPEAN SIGNAL PROCESSING CONFERENCE (EUSIPCO 2021), 2021, : 111 - 115
  • [7] Robust Low-Rank Tensor Completion Based on Tensor Ring Rank via ℓp,ε-Norm
    Li, Xiao Peng
    So, Hing Cheung
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2021, 69 : 3685 - 3698
  • [8] Low-Rank Tensor Completion via Tensor Joint Rank With Logarithmic Composite Norm
    Zhang, Hongbing
    Zheng, Bing
    NUMERICAL LINEAR ALGEBRA WITH APPLICATIONS, 2025, 32 (02)
  • [9] Tensor Completion via Nonlocal Low-Rank Regularization
    Xie, Ting
    Li, Shutao
    Fang, Leyuan
    Liu, Licheng
    IEEE TRANSACTIONS ON CYBERNETICS, 2019, 49 (06) : 2344 - 2354
  • [10] Tensor Recovery via Nonconvex Low-Rank Approximation
    Chen, Lin
    Jiang, Xue
    Liu, Xingzhao
    Zhou, Zhixin
    28TH EUROPEAN SIGNAL PROCESSING CONFERENCE (EUSIPCO 2020), 2021, : 710 - 714