Transient anisotropic kernel for probabilistic learning on manifolds

Cited by: 0
Authors
Soize, Christian [1 ]
Ghanem, Roger [2 ]
Affiliations
[1] Univ Gustave Eiffel, MSME, UMR 8208, 5 Bd Descartes, F-77454 Marne La Vallee, France
[2] Univ Southern Calif, 210 KAP Hall, Los Angeles, CA 90089 USA
Keywords
Transient kernel; Probabilistic learning; PLoM; Diffusion maps; Fokker-Planck operator; Spectrum; FINITE-ELEMENT; DIFFUSION MAPS; SCHRODINGER-EQUATION; INVERSE PROBLEMS; REPRESENTATION; INFORMATION; REDUCTION;
DOI
10.1016/j.cma.2024.117453
Chinese Library Classification
T [Industrial Technology];
Subject Classification Code
08;
Abstract
PLoM (Probabilistic Learning on Manifolds) is a method introduced in 2016 for handling small training datasets. It projects an Itô equation from a stochastic dissipative Hamiltonian dynamical system, acting as the MCMC generator, whose invariant measure is the probability measure estimated by KDE from the training dataset. PLoM performs this projection on a reduced-order vector basis related to the training dataset, namely the diffusion-maps (DMAPS) basis constructed with a time-independent isotropic kernel. In this paper, we propose a new ISDE projection vector basis built from a transient anisotropic kernel, providing an alternative to the DMAPS basis that improves statistical surrogates for stochastic manifolds with heterogeneous data. The construction ensures that, for times near the initial time, the transient basis coincides with the DMAPS basis. For larger times, the differences between the two bases are characterized by the angle between their spanned vector subspaces. The optimal instant, which yields the optimal transient basis, is determined using an estimate of mutual information from Information Theory, normalized by the entropy estimate to account for the effect of the number of realizations used in the estimations. Consequently, this new vector basis better represents the statistical dependencies in the learned probability measure, for any dimension. Three applications with varying levels of statistical complexity and data heterogeneity validate the proposed theory, showing that the transient anisotropic kernel improves the learned probability measure.
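The abstract contrasts a basis built from a time-independent isotropic kernel with a transient alternative, and compares the two via the angle between their spanned subspaces. The following is a minimal illustrative sketch, not the authors' algorithm: the `dmaps_basis` helper, the Gaussian bandwidth `epsilon`, and the toy data are all assumptions made for demonstration. It builds a diffusion-maps basis from an isotropic Gaussian kernel and measures the largest principal angle between the subspaces obtained with two different bandwidths.

```python
import numpy as np
from scipy.linalg import subspace_angles

def dmaps_basis(X, epsilon, m):
    """Diffusion-maps basis from a time-independent isotropic Gaussian kernel.

    X: (N, n) array of N data points; epsilon: kernel bandwidth;
    m: number of basis vectors retained.
    """
    # Pairwise squared distances and isotropic Gaussian kernel matrix
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-d2 / (4.0 * epsilon))
    # Row-normalize to obtain a Markov transition matrix
    P = K / K.sum(axis=1, keepdims=True)
    # Right eigenvectors of P, ordered by decreasing eigenvalue,
    # give the diffusion-maps basis
    w, V = np.linalg.eig(P)
    order = np.argsort(-w.real)
    return V[:, order[:m]].real

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 3))          # toy dataset: 60 points in R^3
B1 = dmaps_basis(X, epsilon=1.0, m=4)
B2 = dmaps_basis(X, epsilon=4.0, m=4)
# Largest principal angle (in radians) between the two spanned subspaces
theta = subspace_angles(B1, B2).max()
```

In the paper the comparison is between the DMAPS basis and the optimal transient basis rather than between two bandwidths, but the subspace-angle diagnostic itself is computed in the same way.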
Pages: 38