Self-supervised deep subspace clustering with entropy-norm

Cited by: 0
Authors
Guangyi Zhao
Simin Kou
Xuesong Yin
Guodao Zhang
Yigang Wang
Institutions
[1] Hangzhou Dianzi University, Department of Digital Media Technology
Source
Cluster Computing | 2024 / Vol. 27
Keywords
Deep subspace clustering; Self-supervise; Contrastive learning; Entropy-norm;
DOI
Not available
Abstract
Auto-Encoder based Deep Subspace Clustering (DSC) has been widely applied in computer vision, motion segmentation and image processing. However, existing DSC methods suffer from two limitations: (1) due to the reconstruction loss, they ignore rich, useful relational information and the connectivity within each subspace; (2) they design convolutional networks individually for specific datasets. To address these problems and improve the performance of DSC, we propose a novel algorithm called Self-Supervised deep Subspace Clustering with Entropy-norm (S³CE) in this paper. Firstly, S³CE introduces self-supervised contrastive learning to pre-train the encoder instead of requiring a decoder. The trained encoder is then used as a feature extractor to segment subspaces by combining a self-expression layer with an entropy-norm constraint. This not only preserves the local structure of the data but also improves the connectivity between data points. Extensive experimental results demonstrate the superior performance of S³CE in comparison to state-of-the-art approaches.
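As a rough illustration of the second stage described in the abstract, the sketch below (plain Python; the function names, the row-stochastic assumption on C, and the exact sign convention are assumptions for illustration, not the paper's implementation) combines a self-expressive reconstruction term ||Z - CZ||²_F with an entropy-norm term Σᵢⱼ cᵢⱼ log cᵢⱼ on the coefficient matrix C. For row-stochastic C, minimizing that term pushes each row toward a more uniform distribution, i.e. denser, better-connected affinities within a subspace:

```python
import math

def entropy_norm(c):
    """Negative entropy sum_ij c_ij * log(c_ij) of a coefficient matrix.

    For row-stochastic C this penalty is smallest for uniform rows, so
    minimizing it encourages dense, balanced connections between data
    points (the connectivity effect the abstract refers to).
    """
    return sum(v * math.log(v) for row in c for v in row if v > 0.0)

def self_expression_loss(z, c, lam=1.0):
    """Toy objective ||Z - CZ||_F^2 + lam * entropy_norm(C).

    z : n feature vectors of dimension d (e.g. outputs of a pre-trained
        encoder); c : n x n self-expression coefficients; lam : weight of
    the entropy-norm term. All names here are illustrative assumptions.
    """
    n, d = len(z), len(z[0])
    recon = 0.0
    for i in range(n):
        for k in range(d):
            # Reconstruct sample i as a weighted combination of all samples.
            zhat = sum(c[i][j] * z[j][k] for j in range(n))
            recon += (z[i][k] - zhat) ** 2
    return recon + lam * entropy_norm(c)
```

With identical features and uniform coefficients the reconstruction term vanishes and only the (negative) entropy-norm term remains, which is the regime the constraint is meant to favor.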
Pages: 1611-1623
Page count: 12
Related Papers
50 records in total
  • [21] Deep self-supervised clustering with embedding adjacent graph features
    Jiang, Xiao
    Qian, Pengjiang
    Jiang, Yizhang
    Gu, Yi
    Chen, Aiguo
    Systems Science & Control Engineering, 2022, 10 (01): 336-346
  • [22] Self-supervised Deep Correlational Multi-view Clustering
    Xin, Bowen
    Zeng, Shan
    Wang, Xiuying
    2021 International Joint Conference on Neural Networks (IJCNN), 2021
  • [23] Dual Alignment Self-Supervised Incomplete Multi-View Subspace Clustering Network
    Zhao, Liang
    Zhang, Jie
    Wang, Qiuhao
    Chen, Zhikui
    IEEE Signal Processing Letters, 2021, 28: 2122-2126
  • [24] Deep Self-Supervised Clustering of the Dark Web for Cyber Threat Intelligence
    Kadoguchi, Masashi
    Kobayashi, Hanae
    Hayashi, Shota
    Otsuka, Akira
    Hashimoto, Masaki
    2020 IEEE International Conference on Intelligence and Security Informatics (ISI), 2020: 163-168
  • [25] Deep Self-Supervised Attributed Graph Clustering for Social Network Analysis
    Lu, Hu
    Hong, Haotian
    Geng, Xia
    Neural Processing Letters, 2024, 56 (02)
  • [27] Deep Self-Supervised Graph Attention Convolution Autoencoder for Networks Clustering
    Chen, Chao
    Lu, Hu
    Hong, Haotian
    Wang, Hai
    Wan, Shaohua
    IEEE Transactions on Consumer Electronics, 2023, 69 (04): 974-983
  • [28] Self-supervised autoencoders for clustering and classification
    Nousi, Paraskevi
    Tefas, Anastasios
    Evolving Systems, 2020, 11 (03): 453-466
  • [30] Hypergraph-Supervised Deep Subspace Clustering
    Hu, Yu
    Cai, Hongmin
    Mathematics, 2021, 9 (24)