Maximal Linear Embedding for Dimensionality Reduction

Cited by: 50
Authors
Wang, Ruiping [1 ]
Shan, Shiguang [2 ]
Chen, Xilin [2 ]
Chen, Jie [3 ]
Gao, Wen [4 ]
Affiliations
[1] Tsinghua Univ, Dept Automat, Broadband Network & Multimedia Lab, Beijing 100084, Peoples R China
[2] Chinese Acad Sci, Inst Comp Technol, Key Lab Intelligent Informat Proc, Beijing 100190, Peoples R China
[3] Univ Oulu, Dept Elect & Informat Engn, Machine Vis Grp, FI-90014 Oulu, Finland
[4] Peking Univ, Sch EECS, Key Lab Machine Percept MoE, Beijing 100871, Peoples R China
Keywords
Dimensionality reduction; manifold learning; maximal linear patch; landmarks-based global alignment; INTRINSIC DIMENSIONALITY; COMPONENT ANALYSIS; MANIFOLD; FACE; PROJECTION; EIGENMAPS; FRAMEWORK; ALIGNMENT;
DOI
10.1109/TPAMI.2011.39
CLC classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Over the past few decades, dimensionality reduction has been widely exploited in computer vision and pattern analysis. This paper proposes a simple but effective nonlinear dimensionality reduction algorithm, named Maximal Linear Embedding (MLE). MLE learns a parametric mapping that recovers a single global low-dimensional coordinate space and yields an isometric embedding of the manifold. Guided by geometric intuition, we introduce the Maximal Linear Patch (MLP), a locally linear patch defined to maximize the neighborhood within which linearity holds. The input data are first decomposed into a collection of local linear models, each describing an MLP. These local models are then aligned into a global coordinate space by applying MDS to a set of randomly selected landmarks. The proposed alignment method, called Landmarks-based Global Alignment (LGA), efficiently produces a closed-form solution with no risk of local optima: it involves only small-scale eigenvalue problems, whereas most previous alignment techniques rely on time-consuming iterative optimization. Compared with traditional methods such as ISOMAP and LLE, MLE yields an explicit model of the intrinsic variation modes of the observed data. Extensive experiments on both synthetic and real data demonstrate the effectiveness and efficiency of the proposed algorithm.
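The closed-form character of the LGA step comes from classical MDS, which embeds a small set of landmarks via a single eigendecomposition of the double-centered squared-distance matrix. The sketch below is not the authors' implementation; it only illustrates the landmark-MDS subproblem on toy data lying in a linear subspace (the function name, toy setup, and parameter choices are all illustrative).

```python
import numpy as np

def classical_mds(D, d):
    """Classical (Torgerson) MDS: embed points in R^d from a pairwise
    Euclidean distance matrix D via one small eigenvalue problem."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n       # centering matrix
    B = -0.5 * J @ (D ** 2) @ J               # double-centered Gram matrix
    w, V = np.linalg.eigh(B)                  # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:d]             # keep the d largest
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

# Toy data: points on a 2D linear subspace of R^3, so a 2D embedding
# can reproduce their 3D Euclidean distances exactly.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2)) @ rng.normal(size=(2, 3))

# Randomly select a few landmarks and embed them by classical MDS,
# mimicking the landmark selection in LGA.
landmarks = rng.choice(len(X), size=20, replace=False)
D = np.linalg.norm(X[landmarks, None] - X[None, landmarks], axis=-1)
Y = classical_mds(D, 2)                       # 2D landmark coordinates
```

Because the eigendecomposition is over the (small) landmark set rather than the full dataset, this step stays cheap even for large inputs, which is the efficiency argument the abstract makes against iterative alignment schemes.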
Pages: 1776-1792 (17 pages)
Related papers (50 total)
  • [21] Regularized Kernel Local Linear Embedding on Dimensionality Reduction for Non-vectorial Data
    Guo, Yi
    Gao, Junbin
    Kwan, Paul W.
    AI 2009: ADVANCES IN ARTIFICIAL INTELLIGENCE, PROCEEDINGS, 2009, 5866 : 240 - +
  • [22] Multiview Locally Linear Embedding for Spectral-Spatial Dimensionality Reduction of Hyperspectral Imagery
    Ji, Haochen
    Zuo, Zongyu
    IEEE/CAA JOURNAL OF AUTOMATICA SINICA, 2022, 9 (06) : 1091 - 1094
  • [24] Robust jointly sparse embedding for dimensionality reduction
    Lai, Zhihui
    Chen, Yudong
    Mo, Dongmei
    Wen, Jiajun
    Kong, Heng
    NEUROCOMPUTING, 2018, 314 : 30 - 38
  • [25] Word Embedding of Dimensionality Reduction for Document Clustering
    Zhu, Pengyu
    Lang, Qi
    Liu, Xiaodong
    2023 35TH CHINESE CONTROL AND DECISION CONFERENCE, CCDC, 2023, : 4371 - 4376
  • [26] Stable sparse subspace embedding for dimensionality reduction
    Chen, Li
    Zhou, Shuizheng
    Ma, Jiajun
    KNOWLEDGE-BASED SYSTEMS, 2020, 195
  • [27] Ordinal Embedding: Approximation Algorithms and Dimensionality Reduction
    Badoiu, Mihai
    Demaine, Erik D.
    Hajiaghayi, MohammadTaghi
    Sidiropoulos, Anastasios
    Zadimoghaddam, Morteza
    APPROXIMATION RANDOMIZATION AND COMBINATORIAL OPTIMIZATION: ALGORITHMS AND TECHNIQUES, PROCEEDINGS, 2008, 5171 : 21 - 34
  • [28] Sketching, Embedding, and Dimensionality Reduction for Information Spaces
    Abdullah, Amirali
    Kumar, Ravi
    McGregor, Andrew
    Vassilvitskii, Sergei
    Venkatasubramanian, Suresh
    ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 51, 2016, 51 : 948 - 956
  • [29] Approximate Orthogonal Sparse Embedding for Dimensionality Reduction
    Lai, Zhihui
    Wong, Wai Keung
    Xu, Yong
    Yang, Jian
    Zhang, David
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2016, 27 (04) : 723 - 735
  • [30] Dimensionality Reduction by Using Sparse Reconstruction Embedding
    Huang, Shaoli
    Cai, Cheng
    Zhang, Yang
    ADVANCES IN MULTIMEDIA INFORMATION PROCESSING-PCM 2010, PT II, 2010, 6298 : 167 - 178