Robust and stochastic sparse subspace clustering

Times Cited: 0
Authors
Zhu, Yanjiao [1]
Li, Xinrong [2]
Xiu, Xianchao [3]
Liu, Wanquan [4]
Yin, Chuancun [1]
Affiliations
[1] Qufu Normal Univ, Sch Stat & Data Sci, Qufu, Peoples R China
[2] Northeastern Univ, Natl Frontiers Sci Ctr Ind Intelligence & Syst Opt, Shenyang, Peoples R China
[3] Shanghai Univ, Sch Mechatron Engn & Automat, Shanghai, Peoples R China
[4] Sun Yat Sen Univ, Sch Intelligent Syst Engn, Guangzhou, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Sparse subspace clustering; Stochastic; Huber function; Proximal alternating minimization; MINIMIZATION; CONVERGENCE; ALGORITHMS; NONCONVEX;
DOI
10.1016/j.neucom.2024.128703
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Sparse subspace clustering (SSC) has been widely employed in machine learning and pattern recognition, but it still faces scalability challenges on large-scale datasets. Recently, stochastic SSC (SSSC) has emerged as an effective solution by leveraging the dropout technique. However, SSSC cannot robustly handle noise, especially non-Gaussian noise, which leads to unsatisfactory clustering performance. To address these issues, we propose a novel robust and stochastic method called stochastic sparse subspace clustering with the Huber function (S3CH). The key idea is to introduce the Huber surrogate to measure the loss of the stochastic self-expression framework, so that S3CH inherits the advantage of the stochastic framework for large-scale problems while mitigating sensitivity to non-Gaussian noise. Algorithmically, an efficient proximal alternating minimization (PAM)-based optimization scheme is developed. Theoretically, the convergence of the generated sequence is rigorously proved. Extensive numerical experiments on synthetic data and six real-world datasets validate the advantages of the proposed method in terms of clustering accuracy, noise robustness, parameter sensitivity, post-hoc analysis, and model stability.
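For readers unfamiliar with the Huber surrogate mentioned in the abstract, a minimal sketch follows. The Huber function is given in its standard form; the accompanying objective is only an illustrative Huber-loss variant of the usual SSC self-expression model, not the exact S3CH formulation (the paper's stochastic dropout mechanism and precise constraints are not reproduced here), so the symbols X, C, delta, and lambda are assumptions used for illustration.

\[
H_\delta(r) =
\begin{cases}
\tfrac{1}{2}\, r^{2}, & |r| \le \delta, \\
\delta\,|r| - \tfrac{1}{2}\,\delta^{2}, & |r| > \delta,
\end{cases}
\qquad
\min_{C}\ \sum_{i,j} H_\delta\!\big( (X - XC)_{ij} \big) + \lambda \,\|C\|_{1}
\quad \text{s.t.} \quad \operatorname{diag}(C) = 0,
\]

where X is the data matrix whose columns are the samples, C is the self-expression coefficient matrix, delta > 0 sets the point at which the loss switches from quadratic to linear growth (limiting the influence of large, non-Gaussian residuals), and lambda > 0 weights the sparsity-inducing l1 penalty. This switch from quadratic to linear growth is what makes a Huber-type loss less sensitive to outliers than the squared Frobenius loss used in standard SSC.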
Pages: 13