Robust and stochastic sparse subspace clustering

Cited: 0
Authors
Zhu, Yanjiao [1 ]
Li, Xinrong [2 ]
Xiu, Xianchao [3 ]
Liu, Wanquan [4 ]
Yin, Chuancun [1 ]
Affiliations
[1] Qufu Normal Univ, Sch Stat & Data Sci, Qufu, Peoples R China
[2] Northeastern Univ, Natl Frontiers Sci Ctr Ind Intelligence & Syst Opt, Shenyang, Peoples R China
[3] Shanghai Univ, Sch Mechatron Engn & Automat, Shanghai, Peoples R China
[4] Sun Yat Sen Univ, Sch Intelligent Syst Engn, Guangzhou, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Sparse subspace clustering; Stochastic; Huber function; Proximal alternating minimization; MINIMIZATION; CONVERGENCE; ALGORITHMS; NONCONVEX;
DOI
10.1016/j.neucom.2024.128703
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Sparse subspace clustering (SSC) has been widely employed in machine learning and pattern recognition, but it still faces scalability challenges on large-scale datasets. Recently, stochastic SSC (SSSC) has emerged as an effective solution by leveraging the dropout technique. However, SSSC cannot robustly handle noise, especially non-Gaussian noise, leading to unsatisfactory clustering performance. To address these issues, we propose a novel robust and stochastic method called stochastic sparse subspace clustering with the Huber function (S3CH). The key idea is to introduce the Huber surrogate to measure the loss of the stochastic self-expression framework, so that S3CH inherits the advantage of the stochastic framework for large-scale problems while mitigating sensitivity to non-Gaussian noise. Algorithmically, an efficient proximal alternating minimization (PAM)-based optimization scheme is developed. Theoretically, the convergence of the generated sequence is rigorously proved. Extensive numerical experiments on synthetic data and six real datasets validate the advantages of the proposed method in clustering accuracy, noise robustness, parameter sensitivity, post-hoc analysis, and model stability.
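To illustrate the abstract's key idea, the following is a minimal sketch of a Huber penalty applied to a stochastic (dropout-based) self-expression residual. The function names, matrix shapes, and the drop-then-rescale step are illustrative assumptions for exposition only, not the paper's actual formulation or algorithm:

```python
import numpy as np

def huber(r, delta=1.0):
    """Huber function: quadratic near zero, linear in the tails,
    which down-weights large (non-Gaussian) residuals."""
    a = np.abs(r)
    return np.where(a <= delta, 0.5 * r**2, delta * (a - 0.5 * delta))

def self_expression_loss(X, C, delta=1.0, keep_prob=0.7, seed=None):
    """Toy stochastic self-expression loss.

    X : (d, n) data matrix whose columns lie near a union of subspaces.
    C : (n, n) self-expression coefficients, aiming for X ~ X @ C.
    Columns of the dictionary X are randomly dropped (dropout) and the
    kept columns rescaled by 1/keep_prob; the residual is scored with
    the Huber function instead of a squared loss.
    """
    rng = np.random.default_rng(seed)
    mask = rng.random(X.shape[1]) < keep_prob   # dropout on dictionary columns
    Xs = X[:, mask] / keep_prob                 # rescale surviving columns
    R = X - Xs @ C[mask, :]                     # self-expression residual
    return huber(R, delta).sum()
```

With `keep_prob=1.0` no column is dropped, and a perfect self-expression (e.g. `C` reproducing `X` exactly) yields zero loss; shrinking `keep_prob` makes each evaluation touch only a random subsample, which is the source of the method's scalability.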
Pages: 13
Related Papers
50 records in total
  • [41] Robust sparse coding for subspace learning
    School of Three Gorges Artificial Intelligence, Chongqing Three Gorges University, Wanzhou, Chongqing 404100, China
    Ital. J. Pure Appl. Math., 2020: 986 - 994
  • [42] Graph Convolutional Subspace Clustering: A Robust Subspace Clustering Framework for Hyperspectral Image
    Cai, Yaoming
    Zhang, Zijia
    Cai, Zhihua
    Liu, Xiaobo
    Jiang, Xinwei
    Yan, Qin
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2021, 59 (05): : 4191 - 4202
  • [43] Randomly Sketched Sparse Subspace Clustering for Acoustic Scene Clustering
    Li, Shuoyang
    Wang, Wenwu
    2018 26TH EUROPEAN SIGNAL PROCESSING CONFERENCE (EUSIPCO), 2018, : 2489 - 2493
  • [44] Probabilistic Subspace Clustering Via Sparse Representations
    Adler, Amir
    Elad, Michael
    Hel-Or, Yacov
    IEEE SIGNAL PROCESSING LETTERS, 2013, 20 (01) : 63 - 66
  • [45] Identifiability conditions and subspace clustering in sparse BSS
    Georgiev, Pando
    Theis, Fabian
    Ralescu, Anca
    INDEPENDENT COMPONENT ANALYSIS AND SIGNAL SEPARATION, PROCEEDINGS, 2007, 4666 : 357 - +
  • [46] Sparse Subspace Representation for Spectral Document Clustering
    Saha, Budhaditya
    Dinh Phung
    Pham, Duc Son
    Venkatesh, Svetha
    12TH IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM 2012), 2012, : 1092 - 1097
  • [47] Sparse subspace clustering via nonconvex approximation
    Wenhua Dong
    Xiao-Jun Wu
    Josef Kittler
    He-Feng Yin
    Pattern Analysis and Applications, 2019, 22 : 165 - 176
  • [48] Graph Connectivity in Noisy Sparse Subspace Clustering
    Wang, Yining
    Wang, Yu-Xiang
    Singh, Aarti
    ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 51, 2016, 51 : 538 - 546
  • [49] SPARSE SUBSPACE CLUSTERING WITH MISSING AND CORRUPTED DATA
    Charles, Zachary
    Jalali, Amin
    Willett, Rebecca
    2018 IEEE DATA SCIENCE WORKSHOP (DSW), 2018, : 180 - 184
  • [50] Subspace Clustering with Block Diagonal Sparse Representation
    Xian Fang
    Ruixun Zhang
    Zhengxin Li
    Xiuli Shao
    Neural Processing Letters, 2021, 53 : 4293 - 4312