Slot Self-Attentive Dialogue State Tracking

Cited by: 39
Authors
Ye, Fanghua [1 ]
Manotumruksa, Jarana [1 ]
Zhang, Qiang [1 ]
Li, Shenghui [2 ]
Yilmaz, Emine [1 ]
Affiliations
[1] UCL, London, England
[2] Uppsala Univ, Uppsala, Sweden
Funding
UK Engineering and Physical Sciences Research Council (EPSRC);
Keywords
dialogue state tracking; belief tracking; slot self-attention; task-oriented dialogue system;
DOI
10.1145/3442381.3449939
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
An indispensable component in task-oriented dialogue systems is the dialogue state tracker, which keeps track of users' intentions in the course of conversation. The typical approach towards this goal is to fill in multiple pre-defined slots that are essential to complete the task. Although various dialogue state tracking methods have been proposed in recent years, most of them predict the value of each slot separately and fail to consider the correlations among slots. In this paper, we propose a slot self-attention mechanism that can learn the slot correlations automatically. Specifically, a slot-token attention is first utilized to obtain slot-specific features from the dialogue context. Then a stacked slot self-attention is applied to these features to learn the correlations among slots. We conduct comprehensive experiments on two multi-domain task-oriented dialogue datasets, including MultiWOZ 2.0 and MultiWOZ 2.1. The experimental results demonstrate that our approach achieves state-of-the-art performance on both datasets, verifying the necessity and effectiveness of taking slot correlations into consideration.
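The two-stage mechanism the abstract describes can be sketched with plain scaled dot-product attention: slot queries attend over dialogue-context token features to produce slot-specific features, and a stacked self-attention over those features then lets each slot attend to every other slot. This is a simplified illustration, not the paper's implementation (which would use learned multi-head projections, layer normalization, and feed-forward sublayers); all shapes and names below are illustrative.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def slot_token_attention(slot_emb, token_feats):
    # slot_emb: (J, d), one query embedding per slot
    # token_feats: (T, d), encoded dialogue-context tokens
    scores = slot_emb @ token_feats.T / np.sqrt(slot_emb.shape[-1])  # (J, T)
    return softmax(scores) @ token_feats  # (J, d) slot-specific features

def slot_self_attention(slot_feats, num_layers=2):
    # stacked self-attention over the J slots, so each slot's feature is
    # updated from the features of correlated slots (residual connection)
    for _ in range(num_layers):
        scores = slot_feats @ slot_feats.T / np.sqrt(slot_feats.shape[-1])  # (J, J)
        slot_feats = slot_feats + softmax(scores) @ slot_feats
    return slot_feats

rng = np.random.default_rng(0)
J, T, d = 30, 50, 64  # e.g. 30 slots, 50 context tokens, hidden size 64
slots = rng.normal(size=(J, d))
tokens = rng.normal(size=(T, d))
feats = slot_self_attention(slot_token_attention(slots, tokens))
print(feats.shape)  # (30, 64)
```

The resulting per-slot features would then feed a value prediction head for each slot; because the self-attention stage mixes information across slots, a value decision for one slot (e.g. a hotel's area) can be informed by another (e.g. a restaurant's area).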
Pages: 1598-1608
Page count: 11
Related Papers
50 items
  • [21] Self-Attentive Moving Average for Time Series Prediction
    Su, Yaxi
    Cui, Chaoran
    Qu, Hao
    APPLIED SCIENCES-BASEL, 2022, 12 (07):
  • [22] Graph convolutional network and self-attentive for sequential recommendation
    Guo, Kaifeng
    Zeng, Guolei
    PEERJ COMPUTER SCIENCE, 2023, 9
  • [23] Deep Fourier Kernel for Self-Attentive Point Processes
    Zhu, Shixiang
    Zhang, Minghe
    Ding, Ruyi
    Xie, Yao
    24TH INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS (AISTATS), 2021, 130
  • [24] Dialogue State Distillation Network with Inter-slot Contrastive Learning for Dialogue State Tracking
    Xu, Jing
    Song, Dandan
    Liu, Chong
    Hui, Siu Cheung
    Li, Fei
    Ju, Qiang
    He, Xiaonan
    Xie, Jian
    THIRTY-SEVENTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 37 NO 11, 2023, : 13834 - 13842
  • [25] Self-attentive Rationalization for Interpretable Graph Contrastive Learning
    Li, Sihang
    Luo, Yanchen
    Zhang, An
    Wang, Xiang
    Li, Longfei
    Zhou, Jun
    Chua, Tat-seng
    ACM TRANSACTIONS ON KNOWLEDGE DISCOVERY FROM DATA, 2025, 19 (02)
  • [26] Self-Attentive Similarity Measurement Strategies in Speaker Diarization
    Lin, Qingjian
    Hou, Yu
    Li, Ming
    INTERSPEECH 2020, 2020, : 284 - 288
  • [27] Explicit Sparse Self-Attentive Network for CTR Prediction
    Luo, Yu
    Peng, Wanwan
    Fan, Youping
    Pang, Hong
    Xu, Xiang
    Wu, Xiaohua
    PROCEEDINGS OF THE 10TH INTERNATIONAL CONFERENCE OF INFORMATION AND COMMUNICATION TECHNOLOGY, 2021, 183 : 690 - 695
  • [28] Improving Disfluency Detection by Self-Training a Self-Attentive Model
    Lou, Paria Jamshid
    Johnson, Mark
    58TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2020), 2020, : 3754 - 3763
  • [29] A self-attentive model for tracing knowledge and engagement in parallel
    Jiang, Hua
    Xiao, Bing
    Luo, Yintao
    Ma, Junliang
    PATTERN RECOGNITION LETTERS, 2023, 165 : 25 - 32
  • [30] Locker: Locally Constrained Self-Attentive Sequential Recommendation
    He, Zhankui
    Zhao, Handong
    Wang, Zhaowen
    Lin, Zhe
    Kale, Ajinkya
    McAuley, Julian
    PROCEEDINGS OF THE 30TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT, CIKM 2021, 2021, : 3088 - 3092