Self-Guided Contrastive Learning for BERT Sentence Representations

Cited by: 0
Authors
Kim, Taeuk [1 ]
Yoo, Kang Min [2 ]
Lee, Sang-goo [1 ]
Affiliations
[1] Seoul Natl Univ, Dept Comp Sci & Engn, Seoul, South Korea
[2] NAVER AI Lab, Seongnam, South Korea
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
Although BERT and its variants have reshaped the NLP landscape, it remains unclear how best to derive sentence embeddings from such pre-trained Transformers. In this work, we propose a contrastive learning method that utilizes self-guidance for improving the quality of BERT sentence representations. Our method fine-tunes BERT in a self-supervised fashion, does not rely on data augmentation, and enables the usual [CLS] token embeddings to function as sentence vectors. Moreover, we redesign the contrastive learning objective (NT-Xent) and apply it to sentence representation learning. We demonstrate with extensive experiments that our approach is more effective than competitive baselines on diverse sentence-related tasks. We also show it is efficient at inference and robust to domain shifts.
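For orientation, the sketch below shows the standard NT-Xent (normalized temperature-scaled cross-entropy) objective that the abstract says the paper redesigns; it is a minimal PyTorch illustration of the base loss, not the paper's modified variant. The function name nt_xent_loss and the temperature default are illustrative assumptions, and the pairing of views (e.g., [CLS] vectors from a tuned BERT copy against intermediate-layer views from a frozen copy, as the self-guidance idea suggests) is up to the caller.

```python
import torch
import torch.nn.functional as F

def nt_xent_loss(z1: torch.Tensor, z2: torch.Tensor,
                 temperature: float = 0.1) -> torch.Tensor:
    """Standard NT-Xent loss over a batch (illustrative sketch).

    z1, z2: [N, dim] embeddings of two views of the same N sentences.
    Row i of z1 and row i of z2 form a positive pair; every other row
    in the concatenated 2N batch serves as a negative.
    """
    n = z1.size(0)
    # L2-normalize and stack both views: [2N, dim], unit-length rows.
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)
    # Temperature-scaled cosine similarities between all 2N embeddings.
    sim = z @ z.t() / temperature
    # Exclude each embedding's similarity with itself.
    mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(mask, float("-inf"))
    # The positive for row i is row i+N, and vice versa.
    targets = torch.cat([torch.arange(n, 2 * n),
                         torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)
```

In a setup like the one the abstract describes, z1 could hold the [CLS] sentence vectors being fine-tuned and z2 the self-guidance signals from a fixed encoder copy; the paper's actual objective departs from this plain form.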
Pages: 2528-2540
Page count: 13
Related Papers (50 total)
  • [1] Self-guided Contrastive Learning for Sequential Recommendation
    Shi, Hui
    Du, Hanwen
    Hao, Yongjing
    Sheng, Victor S.
    Cui, Zhiming
    Zhao, Pengpeng
    WEB AND BIG DATA, PT III, APWEB-WAIM 2022, 2023, 13423 : 72 - 86
  • [2] Simple Flow-Based Contrastive Learning for BERT Sentence Representations
    Tian, Ziyi
    Liu, Qun
    Liu, Maotao
    Deng, Wei
    ADVANCES IN SWARM INTELLIGENCE, ICSI 2022, PT II, 2022, : 265 - 275
  • [3] Contrastive Learning Models for Sentence Representations
    Xu, Lingling
    Xie, Haoran
    Li, Zongxi
    Wang, Fu Lee
    Wang, Weiming
    Li, Qing
    ACM TRANSACTIONS ON INTELLIGENT SYSTEMS AND TECHNOLOGY, 2023, 14 (04)
  • [4] Learning to Perturb for Contrastive Learning of Unsupervised Sentence Representations
    Zhou, Kun
    Zhou, Yuanhang
    Zhao, Wayne Xin
    Wen, Ji-Rong
    IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2023, 31 : 3935 - 3944
  • [5] Debiased Contrastive Learning of Unsupervised Sentence Representations
    Zhou, Kun
    Zhang, Beichen
    Zhao, Wayne Xin
    Wen, Ji-Rong
    PROCEEDINGS OF THE 60TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), VOL 1: (LONG PAPERS), 2022, : 6120 - 6130
  • [6] Pairwise Supervised Contrastive Learning of Sentence Representations
    Zhang, Dejiao
    Li, Shang-Wen
    Xiao, Wei
    Zhu, Henghui
    Nallapati, Ramesh
    Arnold, Andrew O.
    Xiang, Bing
    2021 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2021), 2021, : 5786 - 5798
  • [7] Virtual Augmentation Supported Contrastive Learning of Sentence Representations
    Zhang, Dejiao
    Xiao, Wei
    Zhu, Henghui
    Ma, Xiaofei
    Arnold, Andrew O.
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), 2022, : 864 - 876
  • [8] Self-Guided Learning to Denoise for Robust Recommendation
    Gao, Yunjun
    Du, Yuntao
    Hu, Yujia
    Chen, Lu
    Zhu, Xinjun
    Fang, Ziquan
    Zheng, Baihua
    PROCEEDINGS OF THE 45TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL (SIGIR '22), 2022, : 1412 - 1422
  • [9] Partial Label Learning with Self-Guided Retraining
    Feng, Lei
    An, Bo
    THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FIRST INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / NINTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2019, : 3542 - 3549
  • [10] Contextualized and Generalized Sentence Representations by Contrastive Self-Supervised Learning: A Case Study on Discourse Relation Analysis
    Kiyomaru, Hirokazu
    Kurohashi, Sadao
    2021 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL-HLT 2021), 2021, : 5578 - 5584