Maximal Linear Embedding for Dimensionality Reduction

Cited by: 50
Authors
Wang, Ruiping [1 ]
Shan, Shiguang [2 ]
Chen, Xilin [2 ]
Chen, Jie [3 ]
Gao, Wen [4 ]
Affiliations
[1] Tsinghua Univ, Dept Automat, Broadband Network & Multimedia Lab, Beijing 100084, Peoples R China
[2] Chinese Acad Sci, Inst Comp Technol, Key Lab Intelligent Informat Proc, Beijing 100190, Peoples R China
[3] Univ Oulu, Dept Elect & Informat Engn, Machine Vis Grp, FI-90014 Oulu, Finland
[4] Peking Univ, Sch EECS, Key Lab Machine Percept MoE, Beijing 100871, Peoples R China
Keywords
Dimensionality reduction; manifold learning; maximal linear patch; landmarks-based global alignment; INTRINSIC DIMENSIONALITY; COMPONENT ANALYSIS; MANIFOLD; FACE; PROJECTION; EIGENMAPS; FRAMEWORK; ALIGNMENT;
DOI
10.1109/TPAMI.2011.39
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Over the past few decades, dimensionality reduction has been widely exploited in computer vision and pattern analysis. This paper proposes a simple but effective nonlinear dimensionality reduction algorithm, named Maximal Linear Embedding (MLE). MLE learns a parametric mapping to recover a single global low-dimensional coordinate space and yields an isometric embedding for the manifold. Inspired by geometric intuition, we introduce a principled definition of a locally linear patch, the Maximal Linear Patch (MLP), which seeks to maximize the local neighborhood in which linearity holds. The input data are first decomposed into a collection of local linear models, each depicting an MLP. These local models are then aligned into a global coordinate space by applying MDS to a set of randomly selected landmarks. The proposed alignment method, called Landmarks-based Global Alignment (LGA), efficiently produces a closed-form solution with no risk of local optima: it involves only small-scale eigenvalue problems, whereas most previous alignment techniques rely on time-consuming iterative optimization. Compared with traditional methods such as ISOMAP and LLE, MLE yields an explicit model of the intrinsic variation modes of the observed data. Extensive experiments on both synthetic and real data demonstrate the effectiveness and efficiency of the proposed algorithm.
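The landmark step described above rests on classical MDS: embed a small set of randomly chosen landmark points from their pairwise distances, then align the local models to those coordinates. The following is a minimal sketch of the classical MDS core only (not the full LGA alignment from the paper); all function and variable names here are illustrative, not from the original work.

```python
import numpy as np

def classical_mds(D, d):
    """Classical MDS: embed n points in d dimensions from an
    n-by-n Euclidean distance matrix D via double centering."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ (D ** 2) @ J              # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)           # eigenvalues in ascending order
    idx = np.argsort(vals)[::-1][:d]         # keep the top-d eigenpairs
    # Coordinates: eigenvectors scaled by sqrt of (non-negative) eigenvalues.
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))

# Toy illustration: pick random landmarks from high-dimensional data
# and recover low-dimensional landmark coordinates.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))                              # toy data cloud
landmarks = X[rng.choice(200, size=20, replace=False)]      # random landmarks
D = np.linalg.norm(landmarks[:, None] - landmarks[None, :], axis=-1)
coords = classical_mds(D, 2)                                # 2-D coordinates
print(coords.shape)                                         # (20, 2)
```

When the distance matrix is exactly Euclidean and the data are intrinsically d-dimensional, this embedding reproduces the pairwise distances up to a rigid transformation, which is the property the closed-form alignment exploits.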
Pages: 1776 - 1792
Page count: 17
Related papers
50 items in total
  • [41] Locally Minimizing Embedding and Globally Maximizing Variance: Unsupervised Linear Difference Projection for Dimensionality Reduction
    Wan, Minghua
    Lai, Zhihui
    Jin, Zhong
    NEURAL PROCESSING LETTERS, 2011, 33 (03) : 267 - 282
  • [42] Shrinkage-divergence-proximity locally linear embedding algorithm for dimensionality reduction of hyperspectral image
    Luo, Qin
    Tian, Zheng
    Zhao, Zhixiang
    Chinese Optics Letters, 2008, (08) : 558 - 560
  • [44] Predicting User Preferences of Dimensionality Reduction Embedding Quality
    Morariu C.
    Bibal A.
    Cutura R.
    Frenay B.
    Sedlmair M.
    IEEE Transactions on Visualization and Computer Graphics, 2023, 29 (01) : 745 - 755
  • [45] Positive Semi-definite Embedding for Dimensionality Reduction and Out-of-Sample Extensions
    Fanuel, Michael
    Aspeel, Antoine
    Delvenne, Jean-Charles
    Suykens, Johan A. K.
    SIAM JOURNAL ON MATHEMATICS OF DATA SCIENCE, 2022, 4 (01): : 153 - 178
  • [46] Sparsity and Geometry Preserving Graph Embedding for Dimensionality Reduction
    Gou, Jianping
    Yi, Zhang
    Zhang, David
    Zhan, Yongzhao
    Shen, Xiangjun
    Du, Lan
    IEEE ACCESS, 2018, 6 : 75748 - 75766
  • [47] Graph embedding and extensions: A general framework for dimensionality reduction
    Yan, Shuicheng
    Xu, Dong
    Zhang, Benyu
    Zhang, Hong-Jiang
    Yang, Qiang
    Lin, Stephen
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2007, 29 (01) : 40 - 51
  • [48] A Particle Swarm Embedding Algorithm for Nonlinear Dimensionality Reduction
    Kramer, Oliver
    SWARM INTELLIGENCE (ANTS 2012), 2012, 7461 : 1 - 12
  • [49] Low-rank and sparse embedding for dimensionality reduction
    Han, Na
    Wu, Jigang
    Liang, Yingyi
    Fang, Xiaozhao
    Wong, Wai Keung
    Teng, Shaohua
    NEURAL NETWORKS, 2018, 108 : 202 - 216
  • [50] Dimensionality reduction for acoustic vehicle classification with spectral embedding
    Sunu, Justin
    Percus, Allon G.
    2018 IEEE 15TH INTERNATIONAL CONFERENCE ON NETWORKING, SENSING AND CONTROL (ICNSC), 2018,