Effective sample pairs based contrastive learning for clustering

Cited by: 14
Authors
Yin, Jun [1 ]
Wu, Haowei [1 ]
Sun, Shiliang [2 ,3 ]
Affiliations
[1] Shanghai Maritime Univ, Coll Informat Engn, Shanghai 201306, Peoples R China
[2] East China Normal Univ, Sch Comp Sci & Technol, Shanghai 200062, Peoples R China
[3] Minist Educ, Key Lab Adv Theory & Applicat Stat & Data Sci, Shanghai 200062, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Representation learning; Contrastive learning; Deep clustering; Nearest neighbor;
DOI
10.1016/j.inffus.2023.101899
CLC classification
TP18 [Theory of artificial intelligence]
Discipline codes
081104; 0812; 0835; 1405
Abstract
As an indispensable branch of unsupervised learning, deep clustering is rapidly emerging along with the growth of deep neural networks. Recently, the contrastive learning paradigm has been combined with deep clustering to achieve more competitive performance. However, previous works mostly employ random augmentations to construct sample pairs for contrastive clustering: different augmentations of the same sample are treated as positive pairs, which may produce false positives and ignore the semantic variation across different samples. To address these limitations, we present a novel end-to-end contrastive clustering framework termed Contrastive Clustering with Effective Sample pairs construction (CCES), which obtains more semantic information by jointly leveraging an effective data augmentation method, ContrastiveCrop, and constructing positive sample pairs via nearest-neighbor mining. Specifically, we augment original samples with ContrastiveCrop, which explicitly reduces false positives and enlarges the variance of samples. Further, using the extracted feature representations, we provide a strategy that pairs each sample with its nearest neighbor as a positive pair for both instance-wise and cluster-wise contrastive learning. Experimental results on four challenging datasets demonstrate the effectiveness of CCES, which surpasses state-of-the-art deep clustering methods.
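The core idea sketched in the abstract — mining each sample's nearest neighbor in feature space as its positive pair, then applying an instance-wise contrastive (InfoNCE-style) loss — can be illustrated with a small, self-contained example. This is a hypothetical sketch, not the authors' implementation: the function names (`mine_nearest_neighbors`, `info_nce_loss`) and the toy 2-D features are invented for illustration, and cluster-wise contrastive learning and ContrastiveCrop augmentation are omitted.

```python
import numpy as np

def l2_normalize(x):
    """Row-normalize features so dot products equal cosine similarities."""
    return x / np.linalg.norm(x, axis=1, keepdims=True)

def mine_nearest_neighbors(features):
    """For each sample, return the index of its nearest neighbor
    by cosine similarity, excluding the sample itself."""
    z = l2_normalize(features)
    sim = z @ z.T
    np.fill_diagonal(sim, -np.inf)  # a sample cannot be its own neighbor
    return sim.argmax(axis=1)

def info_nce_loss(features, pos_idx, temperature=0.5):
    """Instance-wise InfoNCE loss: each sample's positive is its mined
    nearest neighbor; all other samples in the batch act as negatives."""
    z = l2_normalize(features)
    logits = (z @ z.T) / temperature
    np.fill_diagonal(logits, -np.inf)  # drop self-similarity terms
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(len(z)), pos_idx].mean()

# Toy batch: two tight clusters in 2-D feature space.
feats = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]])
nn = mine_nearest_neighbors(feats)   # pairs 0<->1 and 2<->3
loss = info_nce_loss(feats, nn)
```

In a full pipeline these features would come from the encoder's representations rather than raw data, and the mined pairs would replace (or supplement) the augmentation-only positives that the abstract identifies as a source of false positives.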
Pages: 10
Related papers
50 records
  • [31] Unsupervised vehicle re-identification based on mixed sample contrastive learning
    Yuefeng Wang
    Ying Wei
    Ruipeng Ma
    Lin Wang
    Cuyuan Wang
    Signal, Image and Video Processing, 2022, 16: 2083-2091
  • [32] Contrastive Learning in Single-cell Multiomics Clustering
    Li, Bingjun
    Nabavi, Sheida
    14TH ACM CONFERENCE ON BIOINFORMATICS, COMPUTATIONAL BIOLOGY, AND HEALTH INFORMATICS, BCB 2023, 2023
  • [33] Clustering Enhanced Multiplex Graph Contrastive Representation Learning
    Yuan, Ruiwen
    Tang, Yongqiang
    Wu, Yajing
    Zhang, Wensheng
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2025, 36(01): 1341-1355
  • [34] Contrastive self-representation learning for data clustering
    Zhao, Wenhui
    Gao, Quanxue
    Mei, Shikun
    Yang, Ming
    NEURAL NETWORKS, 2023, 167: 648-655
  • [35] Graph Debiased Contrastive Learning with Joint Representation Clustering
    Zhao, Han
    Yang, Xu
    Wang, Zhenru
    Yang, Erkun
    Deng, Cheng
    PROCEEDINGS OF THE THIRTIETH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2021, 2021: 3434-3440
  • [36] Semantic Spectral Clustering with Contrastive Learning and Neighbor Mining
    Nongxiao Wang
    Xulun Ye
    Jieyu Zhao
    Qing Wang
    Neural Processing Letters, 56
  • [37] Deep Grassmannian multiview subspace clustering with contrastive learning
    Wang, Rui
    Li, Haiqiang
    Hu, Chen
    Wu, Xiao-Jun
    Bao, Yingfang
    ELECTRONIC RESEARCH ARCHIVE, 2024, 32(09): 5424-5450
  • [38] Graph Clustering with High-Order Contrastive Learning
    Li, Wang
    Zhu, En
    Wang, Siwei
    Guo, Xifeng
    ENTROPY, 2023, 25(10)
  • [39] Deep Survival Analysis With Latent Clustering and Contrastive Learning
    Cui, Chang
    Tang, Yongqiang
    Zhang, Wensheng
    IEEE JOURNAL OF BIOMEDICAL AND HEALTH INFORMATICS, 2024, 28(05): 3090-3101
  • [40] Dual contrastive learning for multi-view clustering
    Bao, Yichen
    Zhao, Wenhui
    Zhao, Qin
    Gao, Quanxue
    Yang, Ming
    NEUROCOMPUTING, 2024, 599