Slot Self-Attentive Dialogue State Tracking

Cited by: 39
Authors
Ye, Fanghua [1 ]
Manotumruksa, Jarana [1 ]
Zhang, Qiang [1 ]
Li, Shenghui [2 ]
Yilmaz, Emine [1 ]
Affiliations
[1] UCL, London, England
[2] Uppsala Univ, Uppsala, Sweden
Funding
Engineering and Physical Sciences Research Council (EPSRC);
Keywords
dialogue state tracking; belief tracking; slot self-attention; task-oriented dialogue system;
DOI
10.1145/3442381.3449939
CLC Number (Chinese Library Classification)
TP18 [Theory of Artificial Intelligence];
Discipline Code
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
An indispensable component of task-oriented dialogue systems is the dialogue state tracker, which keeps track of users' intentions over the course of a conversation. The typical approach to this goal is to fill in multiple pre-defined slots that are essential for completing the task. Although various dialogue state tracking methods have been proposed in recent years, most of them predict the value of each slot separately and fail to consider the correlations among slots. In this paper, we propose a slot self-attention mechanism that learns slot correlations automatically. Specifically, a slot-token attention is first used to obtain slot-specific features from the dialogue context. A stacked slot self-attention is then applied to these features to learn the correlations among slots. We conduct comprehensive experiments on two multi-domain task-oriented dialogue datasets, MultiWOZ 2.0 and MultiWOZ 2.1. The experimental results demonstrate that our approach achieves state-of-the-art performance on both datasets, verifying the necessity and effectiveness of taking slot correlations into account.
Pages: 1598-1608
Number of pages: 11