CluCDD: Contrastive Dialogue Disentanglement via Clustering

Cited by: 0
Authors
Gao, Jingsheng [1 ]
Li, Zeyu [1 ]
Xiang, Suncheng [1 ]
Liu, Ting [1 ]
Fu, Yuzhuo [1 ]
Affiliations
[1] Shanghai Jiao Tong Univ, Sch Elect Informat & Elect Engn, Shanghai, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Dialogue Disentanglement; Contrastive Learning; Sequential Information; Clustering; BERT;
DOI
10.1109/ICASSPW59220.2023.10193381
Chinese Library Classification
O42 [Acoustics];
Discipline code
070206; 082403;
Abstract
A huge number of multi-participant dialogues take place online every day, making it difficult for both humans and machines to follow the dialogue dynamics. Dialogue disentanglement aims to separate an entangled dialogue into detached sessions, thereby improving the readability of long, disordered dialogues. Previous studies mainly adopt two-step methods that first classify message pairs and then cluster them, which cannot guarantee clustering performance over the whole dialogue. To address this challenge, we propose a simple yet effective model named CluCDD, which aggregates utterances via contrastive learning. Specifically, our model pulls utterances from the same session together and pushes apart utterances from different sessions; a clustering method is then applied to produce the predicted session labels. Comprehensive experiments on the Movie Dialogue and IRC datasets demonstrate that our model achieves new state-of-the-art results.
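The core idea described above, a contrastive objective that pulls same-session utterances together and pushes apart utterances from different sessions, followed by a clustering step, can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the authors' released implementation: the BERT encoder is replaced by placeholder embeddings, the loss is a generic supervised contrastive loss over gold session labels, and KMeans stands in for the clustering step.

# Minimal sketch (assumption, not CluCDD's official code): contrastive pull/push
# over utterances of one dialogue, then clustering of the resulting embeddings.
import torch
import torch.nn.functional as F
from sklearn.cluster import KMeans


def session_contrastive_loss(embeddings: torch.Tensor,
                             session_ids: torch.Tensor,
                             temperature: float = 0.1) -> torch.Tensor:
    """Supervised contrastive loss: utterances sharing a session id are positives,
    all other utterances in the dialogue are negatives."""
    z = F.normalize(embeddings, dim=-1)                 # cosine-similarity space
    sim = z @ z.t() / temperature                       # pairwise similarities
    n = z.size(0)
    self_mask = torch.eye(n, dtype=torch.bool)
    pos_mask = (session_ids.unsqueeze(0) == session_ids.unsqueeze(1)) & ~self_mask

    # log-softmax over all other utterances, excluding the anchor itself
    sim = sim.masked_fill(self_mask, float('-inf'))
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)

    # average log-probability of positives per anchor (skip anchors with none)
    pos_counts = pos_mask.sum(dim=1)
    valid = pos_counts > 0
    loss = -(log_prob * pos_mask).sum(dim=1)[valid] / pos_counts[valid]
    return loss.mean()


# Toy example: 6 utterance embeddings from 2 entangled sessions.
emb = torch.randn(6, 768, requires_grad=True)           # stand-in for BERT utterance vectors
sessions = torch.tensor([0, 1, 0, 1, 1, 0])             # gold session labels (training only)
loss = session_contrastive_loss(emb, sessions)
loss.backward()                                          # in training this would update the encoder

# Inference: cluster the learned utterance embeddings into predicted sessions.
pred = KMeans(n_clusters=2, n_init=10).fit_predict(emb.detach().numpy())
print(loss.item(), pred)

The clustering step here assumes the number of sessions is known; in practice the paper's pipeline produces the predicted cluster labels from the contrastively trained utterance representations, and any label-count estimation would be handled there.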
Pages: 5