QuesBELM: A BERT based Ensemble Language Model for Natural Questions

Cited by: 0
Authors
Pranesh, Raj Ratn [1 ]
Shekhar, Ambesh [1 ]
Pallavi, Smita [1 ]
Affiliation
[1] Birla Inst Technol, Mesra, India
Keywords
Ensemble model; Question Answering; Deep learning; Natural Language Processing; Transformer Architecture;
DOI
10.1109/icccs49678.2020.9277176
Chinese Library Classification (CLC)
TP [Automation technology, computer technology];
Discipline classification code
0812;
Abstract
A core goal in artificial intelligence is to build systems that can read the web and then answer complex questions about any topic. Such question-answering (QA) systems could have a big impact on the way we access information. In this paper, we address the task of question answering on Google's Natural Questions (NQ) dataset, which contains real user questions issued to Google Search together with answers found in Wikipedia by annotators. In our work, we systematically compare the performance of powerful variants of the Transformer architecture: BERT-base, BERT-large-WWM, and ALBERT-XXL, on the Natural Questions dataset. We also propose a state-of-the-art BERT-based ensemble language model, QuesBELM. QuesBELM leverages the power of existing BERT variants, combining them into a more accurate stacking ensemble model for question answering. The model integrates the top-K predictions from the individual language models to determine the best overall answer. Our model surpassed the baseline language models with harmonic mean scores of 0.731 and 0.582 for the long-answer (LA) and short-answer (SA) tasks respectively, an average improvement of 10% over the baseline models.
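The abstract describes integrating the top-K predictions of several BERT-variant readers to pick a single best answer. The following is a minimal illustrative sketch of that idea, not the authors' implementation: the aggregation rule (summing per-model normalized confidence scores per candidate span) and all model names and scores below are assumptions for demonstration.

```python
from collections import defaultdict

def ensemble_top_k(per_model_candidates):
    """Aggregate top-K answer candidates from several QA models.

    per_model_candidates: one list per model, each holding
    (answer_span, score) pairs from that model's top-K predictions.
    Returns the span with the highest summed normalized score.
    """
    pooled = defaultdict(float)
    for candidates in per_model_candidates:
        # Normalize within each model so no single model dominates.
        total = sum(score for _, score in candidates) or 1.0
        for span, score in candidates:
            pooled[span] += score / total
    return max(pooled, key=pooled.get)

# Hypothetical top-3 outputs from three BERT-variant readers.
bert_base  = [("span_A", 0.6), ("span_B", 0.3), ("span_C", 0.1)]
bert_wwm   = [("span_B", 0.5), ("span_A", 0.4), ("span_D", 0.1)]
albert_xxl = [("span_A", 0.7), ("span_D", 0.2), ("span_B", 0.1)]

best = ensemble_top_k([bert_base, bert_wwm, albert_xxl])  # "span_A"
```

Here "span_A" wins because its pooled normalized score (1.7) exceeds every other candidate's, even though no single model's ranking is trusted outright.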
Pages: 5