Pre-Training Methods for Question Reranking

Cited by: 0

Authors
Campese, Stefano [1,2]
Lauriola, Ivano [2 ]
Moschitti, Alessandro [2 ]
Affiliations
[1] Univ Trento, Trento, Italy
[2] Amazon, Seattle, WA, USA
DOI: not available
Abstract
An interesting approach to Question Answering (QA) is to search for semantically similar questions that have already been answered. This task differs from answer retrieval, as it focuses on questions rather than only on answers, and therefore requires training different models on different data. In this work, we introduce a novel unsupervised pre-training method specialized for retrieving and ranking questions. It leverages (i) knowledge distillation from a basic question retrieval model, and (ii) a new pre-training task and objective for learning to rank questions by their relevance to the query. Our experiments show (i) that the proposed technique achieves state-of-the-art performance on the QRC and Quora-match datasets, and (ii) the benefit of combining re-ranking and retrieval models.
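The core idea in the abstract, distilling relevance knowledge from a question-retrieval model into a reranker, can be sketched as follows. This is a minimal illustrative sketch in PyTorch, assuming a bi-encoder teacher and a cross-encoder student; the model names, mean pooling, and MSE score-regression loss are assumptions for illustration, not the paper's exact pre-training task or objective.

    # Hypothetical sketch: a cross-encoder "student" reranker is trained to
    # reproduce the relevance scores of a bi-encoder "teacher" retrieval model.
    # All model choices below are illustrative assumptions.
    import torch
    import torch.nn.functional as F
    from transformers import AutoModel, AutoModelForSequenceClassification, AutoTokenizer

    teacher_name = "sentence-transformers/all-MiniLM-L6-v2"  # assumed bi-encoder teacher
    student_name = "bert-base-uncased"                       # assumed cross-encoder student

    teacher_tok = AutoTokenizer.from_pretrained(teacher_name)
    teacher = AutoModel.from_pretrained(teacher_name).eval()
    student_tok = AutoTokenizer.from_pretrained(student_name)
    student = AutoModelForSequenceClassification.from_pretrained(student_name, num_labels=1)

    def teacher_scores(query, candidates):
        """Cosine similarity between mean-pooled teacher embeddings (no gradients)."""
        with torch.no_grad():
            enc = teacher_tok([query] + candidates, padding=True, truncation=True,
                              return_tensors="pt")
            hidden = teacher(**enc).last_hidden_state
            mask = enc["attention_mask"].unsqueeze(-1)
            emb = (hidden * mask).sum(1) / mask.sum(1)   # mean pooling over tokens
            emb = F.normalize(emb, dim=-1)
            return emb[1:] @ emb[0]                      # scores, shape (n_candidates,)

    def distillation_step(query, candidates, optimizer):
        """One unsupervised step: regress student scores onto teacher scores."""
        target = teacher_scores(query, candidates)
        enc = student_tok([query] * len(candidates), candidates,
                          padding=True, truncation=True, return_tensors="pt")
        pred = student(**enc).logits.squeeze(-1)         # cross-encoder relevance scores
        loss = F.mse_loss(pred, target)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        return loss.item()

    optimizer = torch.optim.AdamW(student.parameters(), lr=2e-5)
    loss = distillation_step(
        "How do I reset my password?",
        ["How can I change my password?", "What is the capital of France?"],
        optimizer,
    )

In this setup the teacher needs no labels, so the student can be pre-trained on unlabeled question pairs and later fine-tuned or combined with the retrieval model, in the spirit of the retrieval-plus-reranking combination the abstract evaluates.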
Pages: 469-476 (8 pages)