Feature selection from high-order tensorial data via sparse decomposition

Cited by: 8
Authors
Wang, Donghui [1]
Kong, Shu [1]
Affiliation
[1] Zhejiang Univ, Dept Comp Sci & Technol, Hangzhou 310027, Zhejiang, Peoples R China
Keywords
Dimensionality reduction; Feature selection; Tensor decomposition; High-order principal component analysis; Sparse principal component analysis
DOI
10.1016/j.patrec.2012.06.010
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Principal component analysis (PCA) suffers from the fact that each principal component (PC) is a linear combination of all the original variables, which makes the results difficult to interpret. For this reason, sparse PCA (sPCA), which produces modified PCs with sparse loadings, was introduced to resolve this interpretability problem. However, because sPCA is limited to vector-represented data, applying it to reduce dimensionality and select significant features on real-world data, which are often naturally represented as high-order tensors, requires reshaping the data into vectors beforehand; this destroys the intrinsic data structure and induces the curse of dimensionality. Focusing on this issue, this paper addresses the problem of finding a set of critical features with multi-directional sparse loadings directly from tensorial data, and proposes a novel method called sparse high-order PCA (sHOPCA) to derive a set of sparse loadings in multiple directions. A computational complexity analysis is also presented to illustrate the efficiency of sHOPCA. To evaluate the proposed method, we perform several experiments on both synthetic and real-world datasets, and the experimental results demonstrate the merit of sHOPCA for sparse representation of high-order tensorial data. (c) 2012 Elsevier B.V. All rights reserved.
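The core idea in the abstract, deriving sparse loadings along each mode of a tensor rather than vectorizing first, can be illustrated with a minimal sketch. This is not the authors' exact sHOPCA algorithm (the paper's details are not reproduced here); it is a common generic construction for sparse tensor decomposition: alternating power iterations over the modes of a 3-way tensor, with each mode's loading sparsified by l1 soft-thresholding and renormalized.

```python
import numpy as np

def soft_threshold(x, lam):
    """Elementwise soft-thresholding: shrinks entries toward zero and
    sets those with magnitude below lam exactly to zero."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def rank1_sparse_hopca(X, lam=0.1, n_iter=50):
    """Rank-1 sparse loadings for a 3-way tensor X: alternating power
    iterations over the three modes, each followed by soft-thresholding
    (an l1 penalty) and renormalization."""
    def leading(M):
        # Leading left singular vector of a mode unfolding (HOSVD-style init).
        return np.linalg.svd(M, full_matrices=False)[0][:, 0]

    I, J, K = X.shape
    u = leading(X.reshape(I, -1))
    v = leading(np.moveaxis(X, 1, 0).reshape(J, -1))
    w = leading(np.moveaxis(X, 2, 0).reshape(K, -1))
    for _ in range(n_iter):
        # Contract X against the other two modes' loadings, then
        # sparsify and renormalize the current mode's loading.
        u = soft_threshold(np.einsum('ijk,j,k->i', X, v, w), lam)
        if np.linalg.norm(u) > 0:
            u /= np.linalg.norm(u)
        v = soft_threshold(np.einsum('ijk,i,k->j', X, u, w), lam)
        if np.linalg.norm(v) > 0:
            v /= np.linalg.norm(v)
        w = soft_threshold(np.einsum('ijk,i,j->k', X, u, v), lam)
        if np.linalg.norm(w) > 0:
            w /= np.linalg.norm(w)
    return u, v, w

if __name__ == "__main__":
    # Demo: plant a sparse rank-1 signal (2 active features per mode)
    # in noise and recover sparse loadings without vectorizing X.
    rng = np.random.default_rng(1)
    u0 = np.zeros(10); u0[:2] = 1.0; u0 /= np.linalg.norm(u0)
    v0 = np.zeros(8);  v0[:2] = 1.0; v0 /= np.linalg.norm(v0)
    w0 = np.zeros(6);  w0[:2] = 1.0; w0 /= np.linalg.norm(w0)
    X = 5.0 * np.einsum('i,j,k->ijk', u0, v0, w0) \
        + 0.01 * rng.standard_normal((10, 8, 6))
    u, v, w = rank1_sparse_hopca(X, lam=0.2)
    print("nonzeros per mode:",
          np.count_nonzero(u), np.count_nonzero(v), np.count_nonzero(w))
```

The zero entries of each loading mark features that can be discarded in that mode, which is the feature-selection reading of sparse loadings described in the abstract; further components could be extracted by deflating the tensor and repeating.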
Pages: 1695 - 1702 (8 pages)
Related Papers (50 total)
  • [1] Sparse feature selection via local feature and high-order label correlation
    Sun, Lin
    Ma, Yuxuan
    Ding, Weiping
    Xu, Jiucheng
    APPLIED INTELLIGENCE, 2024, 54 (01) : 565 - 591
  • [2] A neural tensor decomposition model for high-order sparse data recovery
    Liao, Tianchi
    Yang, Jinghua
    Chen, Chuan
    Zheng, Zibin
    INFORMATION SCIENCES, 2024, 658
  • [3] Assessing high-order effects in feature importance via predictability decomposition
    Ontivero-Ortega, Marlis
    Faes, Luca
    Cortes, Jesus M.
    Marinazzo, Daniele
    Stramaglia, Sebastiano
    PHYSICAL REVIEW E, 2025, 111 (03)
  • [4] Optimal Sparse Singular Value Decomposition for High-Dimensional High-Order Data
    Zhang, Anru
    Han, Rungang
    JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 2019, 114 (528) : 1708 - 1725
  • [5] A Knowledge Graph Recommendation Model via High-order Feature Interaction and Intent Decomposition
    Zhang, Ruoyi
    Ma, Huifang
    Li, Qingfeng
    Wang, Yike
    Li, Zhixin
    2022 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2022
  • [6] Learning spatiotemporal dynamics from sparse data via a high-order physics-encoded network
    Ren, Pu
    Song, Jialin
    Rao, Chengping
    Wang, Qi
    Guo, Yike
    Sun, Hao
    Liu, Yang
    COMPUTER PHYSICS COMMUNICATIONS, 2025, 312
  • [7] High-order covariate interacted Lasso for feature selection
    Zhang, Zhihong
    Tian, Yiyang
    Bai, Lu
    Xiahou, Jianbing
    Hancock, Edwin
    PATTERN RECOGNITION LETTERS, 2017, 87 : 139 - 146
  • [8] High-order conditional mutual information maximization for dealing with high-order dependencies in feature selection
    Souza, Francisco
    Premebida, Cristiano
    Araujo, Rui
    PATTERN RECOGNITION, 2022, 131
  • [9] Unsupervised feature selection with high-order similarity learning
    Mi, Yong
    Chen, Hongmei
    Luo, Chuan
    Horng, Shi-Jinn
    Li, Tianrui
    KNOWLEDGE-BASED SYSTEMS, 2024, 285