QuesBELM: A BERT based Ensemble Language Model for Natural Questions

Times Cited: 0
Authors
Pranesh, Raj Ratn [1 ]
Shekhar, Ambesh [1 ]
Pallavi, Smita [1 ]
Affiliations
[1] Birla Inst Technol, Mesra, India
Keywords
Ensemble model; Question Answering; Deep learning; Natural Language Processing; Transformer Architecture;
DOI
10.1109/icccs49678.2020.9277176
CLC Classification
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
A core goal in artificial intelligence is to build systems that can read the web and then answer complex questions about any topic. Such question-answering (QA) systems could have a major impact on the way we access information. In this paper, we address the question-answering task on Google's Natural Questions (NQ) dataset, which contains real user questions issued to Google Search together with answers found in Wikipedia by annotators. We systematically compare the performance of powerful Transformer-based models, namely BERT-base, BERT-large-WWM, and ALBERT-XXL, on the Natural Questions dataset. We also propose a state-of-the-art BERT-based ensemble language model, QuesBELM, which combines existing BERT variants into a more accurate stacking ensemble for question answering. The model integrates the top-K predictions from the individual language models to determine the best overall answer. Our model surpassed the baseline language models with harmonic mean (F1) scores of 0.731 and 0.582 on the long-answer (LA) and short-answer (SA) tasks respectively, an average improvement of roughly 10% over the baselines.
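The top-K integration step described in the abstract can be sketched as follows. This is a minimal illustration under our own assumptions (answer candidates as (start, end) token offsets with confidence scores, aggregated by score summation); the function and variable names, including ensemble_top_k and the example scores, are hypothetical and not taken from the authors' code. The f1 helper shows the harmonic-mean metric behind the reported LA/SA scores.

from collections import defaultdict
from typing import Dict, List, Tuple

Span = Tuple[int, int]  # candidate answer as (start_token, end_token)

def ensemble_top_k(model_candidates: Dict[str, List[Tuple[Span, float]]],
                   k: int = 5) -> Span:
    """Pool each model's top-K (span, score) candidates and return the span
    with the highest summed score, so spans several models agree on win."""
    pooled: Dict[Span, float] = defaultdict(float)
    for preds in model_candidates.values():
        for span, score in preds[:k]:
            pooled[span] += score
    return max(pooled, key=pooled.get)

def f1(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall (the NQ evaluation metric)."""
    return 2 * precision * recall / (precision + recall)

# Hypothetical predictions from the three single models compared in the paper.
candidates = {
    "bert-base":      [((10, 14), 0.61), ((3, 5), 0.22)],
    "bert-large-wwm": [((10, 14), 0.74), ((20, 25), 0.40)],
    "albert-xxl":     [((20, 25), 0.55), ((10, 14), 0.51)],
}
print(ensemble_top_k(candidates))  # -> (10, 14), the span two models rank first
print(round(f1(0.70, 0.76), 3))    # -> 0.729

Score summation is only one plausible aggregation rule; majority voting over the pooled top-K candidates would be a comparable variant of the same stacking idea.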
Pages: 5