Natural language based analysis of SQuAD: An analytical approach for BERT

Cited by: 12
Authors
Guven, Zekeriya Anil [1 ]
Unalir, Murat Osman [1 ]
Affiliations
[1] Ege Univ, Dept Comp Engn, Izmir, Turkey
Keywords
Natural language processing; BERT; Text analysis; Question answering; SQuAD;
DOI
10.1016/j.eswa.2022.116592
CLC classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
In recent years, deep learning models have been used to implement question answering systems. In this study, the performance of a question answering system was evaluated from a natural language processing perspective using SQuAD, a dataset developed to measure the performance of deep learning language models. Based on these evaluations, three natural language based methods, collectively named RNP, were proposed to increase performance; they can be used with pre-trained BERT language models and increased the performance of question answering systems built on pre-trained BERT models by 1.1% to 2.4%. Applying the RNP methods together with sentence selection yielded an accuracy increase of between 6.6% and 8.76% in answer detection. Since these methods do not require any training process, it was shown that they can be used in question answering systems to increase the performance of any deep learning model.
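The abstract mentions combining the proposed methods with sentence selection to narrow the context before answer detection. The paper's actual RNP methods are not specified in this record, but a minimal sketch of sentence selection as a pre-filtering step, using naive lexical overlap between the question and each context sentence (a hypothetical stand-in for the paper's approach), might look like:

```python
import re

def select_sentence(question: str, context: str) -> str:
    """Return the context sentence with the highest word overlap with the question.

    A naive illustration of sentence selection as a pre-filter for a BERT
    question-answering model; it is NOT the paper's RNP method, which this
    record does not detail.
    """
    q_tokens = set(re.findall(r"\w+", question.lower()))
    sentences = re.split(r"(?<=[.!?])\s+", context.strip())
    # Score each sentence by how many question words it shares.
    return max(
        sentences,
        key=lambda s: len(q_tokens & set(re.findall(r"\w+", s.lower()))),
    )

context = (
    "SQuAD was released by Stanford in 2016. "
    "It contains over 100,000 question-answer pairs. "
    "The passages are drawn from Wikipedia articles."
)
print(select_sentence("How many question-answer pairs does SQuAD contain?",
                      context))
# → It contains over 100,000 question-answer pairs.
```

In a full pipeline, the selected sentence (rather than the whole passage) would be passed to the BERT reader, which is consistent with the abstract's claim that the methods add no training cost: the filtering happens entirely at inference time.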
Pages: 16
Related papers
50 items in total
  • [1] QuesBELM: A BERT based Ensemble Language Model for Natural Questions
    Pranesh, Raj Ratn
    Shekhar, Ambesh
    Pallavi, Smita
    PROCEEDINGS OF THE 2020 5TH INTERNATIONAL CONFERENCE ON COMPUTING, COMMUNICATION AND SECURITY (ICCCS-2020), 2020,
  • [2] Q-BERT: A BERT-based Framework for Computing SPARQL Similarity in Natural Language
    Wang, Chunpei
    Zhang, Xiaowang
    WWW'20: COMPANION PROCEEDINGS OF THE WEB CONFERENCE 2020, 2020, : 65 - 66
  • [3] BERT Model-based Natural Language to NoSQL Query Conversion using Deep Learning Approach
    Hossen, Kazi Mojammel
    Uddin, Mohammed Nasir
    Arefin, Minhazul
    Uddin, Md Ashraf
    INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2023, 14 (02) : 810 - 821
  • [4] Academic Aggregated Search Approach Based on BERT Language Model
    Achsas, Sanae
    Nfaoui, El Habib
    2022 2ND INTERNATIONAL CONFERENCE ON INNOVATIVE RESEARCH IN APPLIED SCIENCE, ENGINEERING AND TECHNOLOGY (IRASET'2022), 2022, : 944 - 952
  • [5] tRF-BERT: A transformative approach to aspect-based sentiment analysis in the bengali language
    Ahmed, Shihab
    Samia, Moythry Manir
    Sayma, Maksuda Haider
    Kabir, Md. Mohsin
    Mridha, M. F.
    PLOS ONE, 2024, 19 (09)
  • [6] TinyBERT: Distilling BERT for Natural Language Understanding
    Jiao, Xiaoqi
    Yin, Yichun
    Shang, Lifeng
    Jiang, Xin
    Chen, Xiao
    Li, Linlin
    Wang, Fang
    Liu, Qun
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2020, 2020, : 4163 - 4174
  • [7] BERT applications in natural language processing: a review
    Gardazi, Nadia Mushtaq
    Daud, Ali
    Malik, Muhammad Kamran
    Bukhari, Amal
    Alsahfi, Tariq
    Alshemaimri, Bader
    ARTIFICIAL INTELLIGENCE REVIEW, 2025, 58 (06)
  • [8] BERT for Natural Language Processing in Bahasa Indonesia
    Sebastian, Danny
    Purnomo, Hindriyanto Dwi
    Sembiring, Irwan
    2022 2ND INTERNATIONAL CONFERENCE ON INTELLIGENT CYBERNETICS TECHNOLOGY & APPLICATIONS (ICICYTA), 2022, : 204 - 209
  • [9] A Hybrid Approach to Dimensional Aspect-Based Sentiment Analysis Using BERT and Large Language Models
    Zhang, Yice
    Xu, Hongling
    Zhang, Delong
    Xu, Ruifeng
    ELECTRONICS, 2024, 13 (18)
  • [10] Natural Language Understanding with Privacy-Preserving BERT
    Qu, Chen
    Kong, Weize
    Yang, Liu
    Zhang, Mingyang
    Bendersky, Michael
    Najork, Marc
    PROCEEDINGS OF THE 30TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT, CIKM 2021, 2021, : 1488 - 1497