Objective-sensitive principal component analysis for high-dimensional inverse problems

Cited by: 2
Authors:
Elizarev, Maksim [1 ]
Mukhin, Andrei [1 ]
Khlyupin, Aleksey [1 ]
Affiliations:
[1] Moscow Inst Phys & Technol, Ctr Engn & Technol, 9 Institutskiy Per, Dolgoprudnyi 141701, Russia
Keywords:
Principal component analysis; Dimensionality reduction; Inverse problems; Optimization; History matching; Reservoir simulation; ENSEMBLE KALMAN FILTER; DIFFERENTIABLE PARAMETERIZATION; EFFICIENT; MODEL; REPRESENTATION; MEDIA;
DOI
10.1007/s10596-021-10081-y
CLC classification:
TP39 [Computer applications]
Discipline codes:
081203; 0835
Abstract:
We introduce a novel approach to data-driven dimensionality reduction for solving high-dimensional optimization problems, including history matching. The Objective-Sensitive parameterization of the argument accounts for the corresponding change in the objective function value. This is achieved via an extension of the conventional loss function, which quantifies only the approximation error over realizations. This paper presents three instances of such an approach based on Principal Component Analysis (PCA). Gradient-Sensitive PCA (GS-PCA) exploits a linear approximation of the objective function. Two other approaches solve the problem approximately within the framework of stationary perturbation theory (SPT). All the algorithms are verified and tested with a synthetic reservoir model. The results demonstrate improvements in parameterization quality in terms of revealing the unconstrained minimum of the objective function. We also outline possible extensions and analyze the overall applicability of the Objective-Sensitive approach, which can be combined with modern parameterization techniques beyond PCA.
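The abstract only sketches the idea of extending the PCA loss with an objective-sensitive term. As an illustration, here is a minimal Python sketch of one plausible gradient-sensitive variant, assuming the extended loss weights the reconstruction error by a metric W = I + alpha * g g^T built from the objective-function gradient g at the mean model; the function name `gradient_sensitive_pca` and the parameter `alpha` are hypothetical, and the paper's exact formulation may differ.

```python
import numpy as np

def gradient_sensitive_pca(X, g, k, alpha=1.0):
    """Rank-k linear basis minimizing a gradient-weighted reconstruction loss.

    X     : (n_real, n_dim) matrix of centered realizations.
    g     : (n_dim,) gradient of the objective function (assumed known).
    k     : number of retained components.
    alpha : weight of the objective-sensitive term (alpha=0 -> ordinary PCA).

    Assumed loss: sum_i (x_i - xhat_i)^T W (x_i - xhat_i), W = I + alpha*g g^T,
    i.e. reconstruction errors along the gradient direction are penalized more.
    """
    n_dim = X.shape[1]
    W = np.eye(n_dim) + alpha * np.outer(g, g)      # sensitivity metric
    # Symmetric square root of W (and its inverse) via eigendecomposition.
    w, V = np.linalg.eigh(W)
    W_half = V @ np.diag(np.sqrt(w)) @ V.T
    W_half_inv = V @ np.diag(1.0 / np.sqrt(w)) @ V.T
    # Weighted PCA = ordinary PCA of the data mapped into the W-metric.
    _, _, Vt = np.linalg.svd(X @ W_half, full_matrices=False)
    # Map the leading right singular vectors back to the original space;
    # the columns of Phi are W-orthonormal, not Euclidean-orthonormal.
    return W_half_inv @ Vt[:k].T                    # (n_dim, k) basis
```

With alpha = 0 the metric reduces to the identity and the routine returns the conventional PCA basis, which makes the objective-sensitive term an explicit, tunable correction rather than a replacement of PCA.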
Pages: 2019-2031 (13 pages)
Related papers (50 items total; showing [31]-[40])
  • [31] New high-dimensional indexing structure based on principal component sorting
    School of Computer Science and Engineering, Xidian University, Xi'an 710071, China
    Xi Tong Cheng Yu Dian Zi Ji Shu / Systems Engineering and Electronics, 2006, (12): 1927-1931
  • [32] High-Dimensional Principal Projections
    Mas, Andre
    Ruymgaart, Frits
    COMPLEX ANALYSIS AND OPERATOR THEORY, 2015, 9 (01) : 35 - 63
  • [34] Asymptotic distribution of the LR statistic for equality of the smallest eigenvalues in high-dimensional principal component analysis
    Fujikoshi, Yasunori
    Yamada, Takayuki
    Watanabe, Daisuke
    Sugiyama, Takakazu
    JOURNAL OF MULTIVARIATE ANALYSIS, 2007, 98 (10) : 2002 - 2008
  • [35] CONSISTENCY OF AIC AND BIC IN ESTIMATING THE NUMBER OF SIGNIFICANT COMPONENTS IN HIGH-DIMENSIONAL PRINCIPAL COMPONENT ANALYSIS
    Bai, Zhidong
    Choi, Kwok Pui
    Fujikoshi, Yasunori
    ANNALS OF STATISTICS, 2018, 46 (03): : 1050 - 1076
  • [36] Constrained principal component analysis with stochastically ordered scores for high-dimensional mass spectrometry data
    Hyun, Hyeong Jin
    Kim, Youngrae
    Kim, Sun Jo
    Kim, Joungyeon
    Lim, Johan
    Lim, Dong Kyu
    Kwon, Sung Won
    CHEMOMETRICS AND INTELLIGENT LABORATORY SYSTEMS, 2021, 216
  • [37] Tensor robust principal component analysis with total generalized variation for high-dimensional data recovery
    Xu, Zhi
    Yang, Jing-Hua
    Wang, Chuan-long
    Wang, Fusheng
    Yan, Xi-hong
    APPLIED MATHEMATICS AND COMPUTATION, 2024, 483
  • [38] Test for high-dimensional outliers with principal component analysis (vol 7, pg 739, 2024)
    Nakayama, Yugo
    Yata, Kazuyoshi
    Aoshima, Makoto
    JAPANESE JOURNAL OF STATISTICS AND DATA SCIENCE, 2025,
  • [39] GEOMETRIC INFERENCE FOR GENERAL HIGH-DIMENSIONAL LINEAR INVERSE PROBLEMS
    Cai, T. Tony
    Liang, Tengyuan
    Rakhlin, Alexander
    ANNALS OF STATISTICS, 2016, 44 (04): : 1536 - 1563
  • [40] Kernel principal component analysis-based Gaussian process regression modelling for high-dimensional reliability analysis
    Zhou, Tong
    Peng, Yongbo
    COMPUTERS & STRUCTURES, 2020, 241