Hessian eigenmaps: Locally linear embedding techniques for high-dimensional data

Cited: 998
|
Authors
Donoho, DL [1 ]
Grimes, C [1 ]
Affiliation
[1] Stanford Univ, Dept Stat, Stanford, CA 94305 USA
Keywords
manifold learning; ISOMAP; tangent coordinates; isometry; Laplacian eigenmaps;
DOI
10.1073/pnas.1031596100
CLC classification
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences];
Subject classification
07 ; 0710 ; 09 ;
Abstract
We describe a method for recovering the underlying parametrization of scattered data (m_i) lying on a manifold M embedded in high-dimensional Euclidean space. The method, Hessian-based locally linear embedding, derives from a conceptual framework of local isometry in which the manifold M, viewed as a Riemannian submanifold of the ambient Euclidean space R^n, is locally isometric to an open, connected subset Θ of Euclidean space R^d. Because Θ does not have to be convex, this framework is able to handle a significantly wider class of situations than the original ISOMAP algorithm. The theoretical framework revolves around a quadratic form H(f) = ∫_M ‖H_f(m)‖_F² dm defined on functions f : M → R. Here H_f denotes the Hessian of f, and H(f) averages the Frobenius norm of the Hessian over M. To define the Hessian, we use orthogonal coordinates on the tangent planes of M. The key observation is that, if M truly is locally isometric to an open, connected subset of R^d, then H(f) has a (d + 1)-dimensional null space consisting of the constant functions and a d-dimensional space of functions spanned by the original isometric coordinates. Hence, the isometric coordinates can be recovered up to a linear isometry. Our method may be viewed as a modification of locally linear embedding, and our theoretical framework as a modification of the Laplacian eigenmaps framework, in which a quadratic form based on the Hessian replaces one based on the Laplacian.
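The algorithm described in the abstract is available in scikit-learn as the `method='hessian'` variant of `LocallyLinearEmbedding`. As a minimal sketch (the S-curve dataset stands in for a manifold isometric to a planar region; the specific neighborhood size is an assumption, not a value from the paper), the d-dimensional coordinates can be recovered up to a linear isometry like this:

```python
# Minimal sketch of Hessian LLE ("Hessian eigenmaps") via scikit-learn.
# The S-curve is a 2-D manifold embedded in R^3; HLLE estimates local
# tangent-plane coordinates, builds the Hessian-based quadratic form, and
# returns its null-space coordinates as the embedding.
from sklearn.datasets import make_s_curve
from sklearn.manifold import LocallyLinearEmbedding

X, t = make_s_curve(n_samples=1000, random_state=0)  # X has shape (1000, 3)

# HLLE requires n_neighbors > d*(d + 3)/2; for d = 2 that means > 5.
hlle = LocallyLinearEmbedding(
    n_neighbors=12,        # local neighborhood size (an illustrative choice)
    n_components=2,        # target dimension d
    method="hessian",      # Hessian estimator instead of standard LLE weights
    random_state=0,
)
Y = hlle.fit_transform(X)  # recovered coordinates, up to a linear isometry
print(Y.shape)             # (1000, 2)
```

Because the framework only assumes Θ is open and connected, not convex, the same call also behaves well on datasets with holes, where ISOMAP's convexity assumption fails.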
Pages: 5591 - 5596
Page count: 6
Related papers
50 records
  • [21] Robust linear regression for high-dimensional data: An overview
    Filzmoser, Peter
    Nordhausen, Klaus
    WILEY INTERDISCIPLINARY REVIEWS-COMPUTATIONAL STATISTICS, 2021, 13 (04)
  • [22] Ensemble Linear Subspace Analysis of High-Dimensional Data
    Ahmed, S. Ejaz
    Amiri, Saeid
    Doksum, Kjell
    ENTROPY, 2021, 23 (03)
  • [23] High-dimensional image data feature extraction by double discriminant embedding
    Imani, Maryam
    Ghassemian, Hassan
    PATTERN ANALYSIS AND APPLICATIONS, 2017, 20 (02) : 473 - 484
  • [24] Network-based Clustering and Embedding for High-Dimensional Data Visualization
    Zhang, Hengyuan
    Chen, Xiaowu
    2013 INTERNATIONAL CONFERENCE ON COMPUTER-AIDED DESIGN AND COMPUTER GRAPHICS (CAD/GRAPHICS), 2013, : 290 - 297
  • [26] Discriminant locally linear embedding with high-order tensor data
    Li, Xuelong
    Lin, Stephen
    Yan, Shuicheng
    Xu, Dong
    IEEE TRANSACTIONS ON SYSTEMS MAN AND CYBERNETICS PART B-CYBERNETICS, 2008, 38 (02): : 342 - 352
  • [27] Gradient-Enhanced Kriging for High-Dimensional Bayesian Optimization with Linear Embedding
    Cheng, Kai
    Zimmermann, Ralf
    AIAA JOURNAL, 2023, 61 (11) : 4946 - 4959
  • [29] Visualizing high-dimensional loss landscapes with Hessian directions
    Boettcher, Lucas
    Wheeler, Gregory
    JOURNAL OF STATISTICAL MECHANICS-THEORY AND EXPERIMENT, 2024, 2024 (02):
  • [30] Unsupervised locally embedded clustering for automatic high-dimensional data labeling
    Fu, Yun
    Huang, Thomas S.
    2007 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, VOL III, PTS 1-3, PROCEEDINGS, 2007, : 1057 - +