Hessian eigenmaps: Locally linear embedding techniques for high-dimensional data

Cited by: 998
Authors
Donoho, DL [1]
Grimes, C [1]
Affiliation
[1] Stanford Univ, Dept Stat, Stanford, CA 94305 USA
Keywords
manifold learning; ISOMAP; tangent coordinates; isometry; Laplacian eigenmaps;
DOI
10.1073/pnas.1031596100
Chinese Library Classification (CLC)
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences]
Subject Classification Codes
07; 0710; 09
Abstract
We describe a method for recovering the underlying parametrization of scattered data $(m_i)$ lying on a manifold $M$ embedded in high-dimensional Euclidean space. The method, Hessian-based locally linear embedding, derives from a conceptual framework of local isometry in which the manifold $M$, viewed as a Riemannian submanifold of the ambient Euclidean space $\mathbb{R}^n$, is locally isometric to an open, connected subset $\Theta$ of Euclidean space $\mathbb{R}^d$. Because $\Theta$ does not have to be convex, this framework is able to handle a significantly wider class of situations than the original ISOMAP algorithm. The theoretical framework revolves around a quadratic form $\mathcal{H}(f) = \int_M \|H_f(m)\|_F^2 \, dm$ defined on functions $f : M \to \mathbb{R}$. Here $H_f$ denotes the Hessian of $f$, and $\mathcal{H}(f)$ averages the Frobenius norm of the Hessian over $M$. To define the Hessian, we use orthogonal coordinates on the tangent planes of $M$. The key observation is that, if $M$ truly is locally isometric to an open, connected subset of $\mathbb{R}^d$, then $\mathcal{H}(f)$ has a $(d+1)$-dimensional null space consisting of the constant functions and a $d$-dimensional space of functions spanned by the original isometric coordinates. Hence, the isometric coordinates can be recovered up to a linear isometry. Our method may be viewed as a modification of locally linear embedding and our theoretical framework as a modification of the Laplacian eigenmaps framework, where we substitute a quadratic form based on the Hessian in place of one based on the Laplacian.
Pages: 5591-5596
Page count: 6
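For readers who want to experiment with the method described in the abstract, scikit-learn ships Hessian eigenmaps as the "hessian" variant of its LocallyLinearEmbedding estimator. The sketch below is illustrative only: the S-curve dataset, neighborhood size, and target dimension are assumptions chosen for demonstration, not settings taken from the paper.

```python
# Minimal sketch: Hessian LLE (Hessian eigenmaps) via scikit-learn.
# Dataset and parameter choices are illustrative assumptions, not from the paper.
from sklearn.datasets import make_s_curve
from sklearn.manifold import LocallyLinearEmbedding

# Sample scattered points m_i from a 2-D manifold embedded in R^3.
X, color = make_s_curve(n_samples=1500, noise=0.05, random_state=0)

# method="hessian" selects the Hessian eigenmaps variant of LLE.
# HLLE requires n_neighbors > n_components * (n_components + 3) / 2.
hlle = LocallyLinearEmbedding(
    n_neighbors=12,
    n_components=2,        # target dimension d
    method="hessian",
    random_state=0,
)
Y = hlle.fit_transform(X)  # recovered coordinates, defined up to a rigid motion
print(Y.shape)             # (1500, 2)
```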