Incremental multi-view spectral clustering with sparse and connected graph learning

Cited by: 30
Authors
Yin, Hongwei [1 ,2 ]
Hu, Wenjun [1 ,2 ]
Zhang, Zhao [3 ,4 ]
Lou, Jungang [1 ,2 ]
Miao, Minmin [1 ,2 ]
Affiliations
[1] Huzhou Univ, Sch Informat Engn, Huzhou 313000, Peoples R China
[2] Huzhou Univ, Zhejiang Prov Key Lab Smart Management & Applicat, Huzhou 313000, Peoples R China
[3] Hefei Univ Technol, Minist Educ, Sch Comp Sci & Informat Engn, Hefei 230009, Peoples R China
[4] Hefei Univ Technol, Minist Educ, Key Lab Knowledge Engn Big Data, Hefei 230009, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Multi-view clustering; Incremental clustering; Sparse graph learning; Connected graph learning; Spectral embedding; MATRIX FACTORIZATION;
DOI
10.1016/j.neunet.2021.08.031
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
In recent years, many effective multi-view clustering methods have been proposed. Most of them, however, fuse all views at once, which makes them infeasible when the number of views increases over time. Directly re-running existing multi-view clustering methods to re-fuse all views whenever a new view arrives is also impractical, because it requires storing every historical view. In this paper, we propose an efficient incremental multi-view spectral clustering method with sparse and connected graph learning (SCGL). In our method, only one consensus similarity matrix is stored to represent the structural information of all historical views. Whenever a newly collected view becomes available, the consensus similarity matrix is reconstructed by learning from its previous version and the new view. To further improve incremental multi-view clustering performance, sparse graph learning and connected graph learning are integrated into our model, which not only reduces noise but also preserves the correct connections within clusters. Experiments on several multi-view datasets demonstrate that our method outperforms traditional methods in clustering accuracy and is better suited to multi-view clustering in which the number of views grows over time. (C) 2021 Elsevier Ltd. All rights reserved.
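The incremental idea described in the abstract, keeping a single consensus similarity matrix and refreshing it from its previous version plus the newly arrived view, can be sketched as follows. This is only an illustrative sketch, not the authors' SCGL algorithm: the class name, the RBF similarity, and the running-average fusion are hypothetical simplifications, whereas the paper learns the consensus graph through an optimization with sparsity and connectivity constraints.

```python
# Illustrative sketch (NOT the SCGL optimization): a consensus similarity
# matrix is kept in memory and fused with each newly arriving view; the
# paper instead learns this matrix with sparse and connected-graph constraints.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.cluster import SpectralClustering


class IncrementalConsensusGraph:
    def __init__(self, n_clusters, gamma=1.0):
        self.n_clusters = n_clusters
        self.gamma = gamma        # RBF kernel width (assumed per-view similarity)
        self.S = None             # consensus similarity matrix over all views seen so far
        self.n_views = 0

    def add_view(self, X):
        """Fuse a newly collected view X (n_samples x n_features) into the consensus graph."""
        W = rbf_kernel(X, gamma=self.gamma)   # per-view similarity graph
        np.fill_diagonal(W, 0.0)
        if self.S is None:
            self.S = W
        else:
            # Running average: the consensus is rebuilt from its previous version
            # and the current view, so no historical view has to be stored.
            self.S = (self.n_views * self.S + W) / (self.n_views + 1)
        self.n_views += 1

    def cluster(self):
        """Spectral clustering on the current consensus similarity matrix."""
        model = SpectralClustering(n_clusters=self.n_clusters,
                                   affinity="precomputed",
                                   assign_labels="kmeans",
                                   random_state=0)
        return model.fit_predict(self.S)
```

Whatever the number of views already fused, only the n x n consensus matrix is held in memory, which is the storage property the abstract emphasizes; the sparse and connected-graph terms of the actual model would additionally prune noisy edges and preserve within-cluster connections.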
Pages: 260-270
Number of pages: 11
Related papers
50 records in total
  • [31] Robust Joint Graph Learning for Multi-View Clustering
    He, Yanfang
    Yusof, Umi Kalsom
    IEEE TRANSACTIONS ON BIG DATA, 2025, 11 (02) : 722 - 734
  • [32] Contrastive Consensus Graph Learning for Multi-View Clustering
    Wang, Shiping
    Lin, Xincan
    Fang, Zihan
    Du, Shide
    Xiao, Guobao
    IEEE/CAA JOURNAL OF AUTOMATICA SINICA, 2022, 9 (11) : 2027 - 2030
  • [33] Consensus Graph Learning for Incomplete Multi-view Clustering
    Zhou, Wei
    Wang, Hao
    Yang, Yan
    ADVANCES IN KNOWLEDGE DISCOVERY AND DATA MINING, PAKDD 2019, PT I, 2019, 11439 : 529 - 540
  • [34] Contrastive and attentive graph learning for multi-view clustering
    Wang, Ru
    Li, Lin
    Tao, Xiaohui
    Wang, Peipei
    Liu, Peiyu
    INFORMATION PROCESSING AND MANAGEMENT, 2022, 59 (04)
  • [35] Single phase multi-view clustering using unified graph learning and spectral representation
    Dornaika, F.
    El Hajjar, S.
    INFORMATION SCIENCES, 2023, 645
  • [36] Essential Tensor Learning for Multi-View Spectral Clustering
    Wu, Jianlong
    Lin, Zhouchen
    Zha, Hongbin
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2019, 28 (12) : 5910 - 5922
  • [37] Robust Tensor Learning for Multi-View Spectral Clustering
    Xie, Deyan
    Li, Zibao
    Sun, Yingkun
    Song, Wei
    ELECTRONICS, 2024, 13 (11)
  • [38] Tensor Learning Induced Multi-View Spectral Clustering
    Chen, Man-Sheng
    Cai, Xiao-Sha
    Lin, Jia-Qi
    Wang, Chang-Dong
    Huang, Dong
    Lai, Jian-Huang
    JISUANJI XUEBAO/CHINESE JOURNAL OF COMPUTERS, 2024, 47 (01) : 52 - 68
  • [39] Learning robust affinity graph representation for multi-view clustering
    Jing, Peiguang
    Su, Yuting
    Li, Zhengnan
    Nie, Liqiang
    INFORMATION SCIENCES, 2021, 544 : 155 - 167
  • [40] Adaptive Topological Graph Learning for Generalized Multi-View Clustering
    He, Wen-jue
    Zhang, Zheng
    2023 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN, 2023