Pre-Training Methods for Question Reranking

Cited by: 0
Authors
Campese, Stefano [1 ,2 ]
Lauriola, Ivano [2 ]
Moschitti, Alessandro [2 ]
Affiliations
[1] Univ Trento, Trento, Italy
[2] Amazon, Seattle, WA USA
DOI: not available
Abstract
One interesting approach to Question Answering (QA) is to search for semantically similar questions that have already been answered. This task differs from answer retrieval in that it focuses on questions rather than only on answers, and therefore requires training different models on different data. In this work, we introduce a novel unsupervised pre-training method specialized for retrieving and ranking questions. It leverages (i) knowledge distillation from a basic question retrieval model, and (ii) a new pre-training task and objective for learning to rank questions by their relevance to the query. Our experiments show that the proposed technique achieves state-of-the-art performance on the QRC and Quora-match datasets, and demonstrate the benefit of combining re-ranking and retrieval models.
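The abstract does not give implementation details for objective (i). As a rough, purely illustrative sketch of what knowledge distillation for question ranking typically looks like, a student reranker can be trained to match the teacher retrieval model's score distribution over the candidate questions for each query (the function names and temperature value below are assumptions, not taken from the paper):

```python
import math

def softmax(scores, temperature=1.0):
    """Turn raw relevance scores into a probability distribution."""
    exps = [math.exp(s / temperature) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def distill_ranking_loss(teacher_scores, student_scores, temperature=2.0):
    """KL divergence between the teacher's and the student's softmax
    distributions over one query's candidate questions.  The loss is
    zero when the student reproduces the teacher's ranking scores
    exactly, and grows as the two distributions diverge."""
    p = softmax(teacher_scores, temperature)  # teacher targets
    q = softmax(student_scores, temperature)  # student predictions
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
```

Minimizing such a loss over many queries transfers the retrieval model's ranking behavior into the pre-trained reranker before any supervised fine-tuning.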
Pages: 469-476 (8 pages)