Objective-sensitive principal component analysis for high-dimensional inverse problems

Cited by: 2
Authors
Elizarev, Maksim [1 ]
Mukhin, Andrei [1 ]
Khlyupin, Aleksey [1 ]
Affiliations
[1] Moscow Inst Phys & Technol, Ctr Engn & Technol, 9 Institutskiy Per, Dolgoprudnyi 141701, Russia
Keywords
Principal component analysis; Dimensionality reduction; Inverse problems; Optimization; History matching; Reservoir simulation; Ensemble Kalman filter; Differentiable parameterization; Efficient; Model; Representation; Media
DOI
10.1007/s10596-021-10081-y
CLC number
TP39 [Applications of computers];
Subject classification
081203 ; 0835 ;
Abstract
We introduce a novel approach to data-driven dimensionality reduction for solving high-dimensional optimization problems, including history matching. Objective-Sensitive parameterization of the argument accounts for the corresponding change in the objective function value. This is achieved by extending the conventional loss function, which quantifies only the approximation error over realizations. This paper presents three instances of the approach based on Principal Component Analysis (PCA). Gradient-Sensitive PCA (GS-PCA) exploits a linear approximation of the objective function. Two other approaches solve the problem approximately within the framework of stationary perturbation theory (SPT). All the algorithms are verified and tested on a synthetic reservoir model. The results demonstrate improvements in parameterization quality with respect to revealing the unconstrained minimum of the objective function. We also outline possible extensions and analyze the overall applicability of the Objective-Sensitive approach, which can be combined with modern parameterization techniques beyond PCA.
Pages: 2019-2031
Number of pages: 13
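The abstract's core idea — extending the conventional PCA loss, which measures only reconstruction error over realizations, with a term that accounts for the induced change in the objective function via its linear (gradient) approximation — can be illustrated with a minimal sketch. All names here (`gspca_loss`, `lam`, the synthetic ensemble) are illustrative assumptions, not the paper's actual implementation:

```python
# Hypothetical sketch of an Objective-Sensitive loss in the spirit of
# GS-PCA: the usual PCA reconstruction error is augmented with the
# squared first-order change of an objective J under reconstruction,
# J(m_hat) - J(m) ~= grad_j @ (m_hat - m).
import numpy as np

rng = np.random.default_rng(0)

# Synthetic ensemble: N realizations of an n-dimensional model vector.
N, n, k = 200, 50, 5
M = rng.normal(size=(N, n))

# Conventional PCA: top-k principal directions of the centered ensemble.
mean = M.mean(axis=0)
_, _, Vt = np.linalg.svd(M - mean, full_matrices=False)
basis = Vt[:k]                      # (k, n) principal directions

def reconstruct(m):
    """Project m onto the k-dimensional PCA subspace and back."""
    z = basis @ (m - mean)
    return mean + basis.T @ z

def gspca_loss(m, grad_j, lam=1.0):
    """Reconstruction error plus a gradient-sensitive penalty.

    With lam = 0 this reduces to the conventional PCA loss; lam > 0
    additionally penalizes reconstructions that shift the linearized
    objective value.
    """
    m_hat = reconstruct(m)
    approx_err = np.sum((m - m_hat) ** 2)
    obj_change = (grad_j @ (m_hat - m)) ** 2
    return approx_err + lam * obj_change

m = M[0]
g = rng.normal(size=n)              # stand-in for the objective gradient
print(gspca_loss(m, g))
```

In the paper's setting the basis itself is re-optimized against such an augmented loss; the sketch above only evaluates it on a fixed PCA basis to show how the extra term enters.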