An Open-Domain Search Quiz Engine Based on Transformer

Cited by: 0
Authors
Niu, Xiaoling [1 ]
Guo, Ge [1 ]
Affiliations
[1] Pingdingshan Inst Ind Technol, Dept Comp Sci & Software Engn, Pingdingshan 467000, Peoples R China
Keywords
Natural language processing; deep learning; transformer; Bi-LSTM; semantic understanding;
DOI
10.14569/IJACSA.2024.01509103
CLC classification number
TP301 [Theory and Methods];
Discipline classification code
081202 ;
Abstract
As the volume of information on the Internet continues to grow exponentially, efficient retrieval of relevant data has become a significant challenge. Traditional keyword matching techniques, while useful, often fall short in addressing the complex and varied queries users present. This paper introduces a novel approach to automated question and answer systems by integrating deep learning and natural language processing (NLP) technologies. Specifically, it combines the Transformer model with the HowNet knowledge base to enhance semantic understanding and contextual relevance of responses. The proposed system architecture includes layers for word embedding, Transformer encoding, attention mechanisms, and Bi-directional Long Short-Term Memory (Bi-LSTM) processing, enabling sophisticated semantic matching and implication recognition. Using the BQ Corpus dataset in the banking and finance domain, the system demonstrated substantial improvements in accuracy and F1-score over existing models. The primary contributions of this research are threefold: (1) the introduction of a semantic fusion approach using HowNet for enhanced contextual understanding, (2) the optimization of Transformer-based deep learning techniques for Q&A systems, and (3) a comprehensive evaluation using the BQ Corpus dataset, demonstrating significant improvements in accuracy and F1-score over baseline models. These contributions have important implications for improving the handling of complex and synonym-rich queries in automated Q&A systems. The experimental results highlight that the integrated approach significantly enhances the performance of automated Q&A systems, offering a more efficient and accurate means of information retrieval. This advancement is particularly crucial in the era of big data and Web 3.0, where the ability to quickly and accurately access relevant information is essential for both users and organizations.
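The record does not include the authors' code; as an illustration only, the scaled dot-product attention step named in the abstract's pipeline (word embedding → Transformer encoding → attention → Bi-LSTM) can be sketched in plain Python. This is a generic, didactic implementation of the standard attention formula, not the paper's actual system.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def scaled_dot_product_attention(queries, keys, values):
    # Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V,
    # the core operation inside a Transformer encoder layer.
    d_k = len(keys[0])
    out = []
    for q in queries:
        # Similarity of this query to every key, scaled by sqrt(d_k).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in keys]
        weights = softmax(scores)
        # Weighted sum of the value vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, values))
                    for j in range(len(values[0]))])
    return out
```

For a query vector aligned with the first key, the attention weights favor the first value vector; in the described system, such weighted representations would then feed the Bi-LSTM matching layers.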
Pages: 1011 - 1020 (10 pages)
Related papers
50 in total
  • [1] Opinion Sentence Search Engine on Open-domain Blog
    Furuse, Osamu
    Hiroshima, Nobuaki
    Yamada, Setsuo
    Kataoka, Ryoji
    20TH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2007, : 2760 - 2765
  • [2] Open-domain conversational search assistants: the Transformer is all you need
    Ferreira, Rafael
    Leite, Mariana
    Semedo, David
    Magalhaes, Joao
    INFORMATION RETRIEVAL JOURNAL, 2022, 25 (02): : 123 - 148
  • [4] A Randomized Link Transformer for Diverse Open-Domain Dialogue Generation
    Lee, Jing Yang
    Lee, Kong Aik
    Gan, Woon Seng
    PROCEEDINGS OF THE 4TH WORKSHOP ON NLP FOR CONVERSATIONAL AI, 2022, : 1 - 11
  • [5] Constraint-Based Open-Domain Question Answering Using Knowledge Graph Search
    Aghaebrahimian, Ahmad
    Jurcicek, Filip
    TEXT, SPEECH, AND DIALOGUE, 2016, 9924 : 28 - 36
  • [6] Open-Domain Long-Form Question–Answering Using Transformer-Based Pipeline
    Dash, A.
    Awachar, M.
    Patel, A.
    Rudra, B.
    SN Computer Science, 4 (5)
  • [7] Natural Language Generation Using Transformer Network in an Open-Domain Setting
    Varshney, Deeksha
    Ekbal, Asif
    Nagaraja, Ganesh Prasad
    Tiwari, Mrigank
    Gopinath, Abhijith Athreya Mysore
    Bhattacharyya, Pushpak
    NATURAL LANGUAGE PROCESSING AND INFORMATION SYSTEMS (NLDB 2020), 2020, 12089 : 82 - 93
  • [8] Model-based Merging of Open-Domain Ontologies
    Bouraoui, Zied
    Konieczny, Sebastien
    Truong-Thanh Ma
    Varzinczak, Ivan
    2020 IEEE 32ND INTERNATIONAL CONFERENCE ON TOOLS WITH ARTIFICIAL INTELLIGENCE (ICTAI), 2020, : 29 - 34
  • [9] SPARTA: Efficient Open-Domain Question Answering via Sparse Transformer Matching Retrieval
    Zhao, Tiancheng
    Lu, Xiaopeng
    Lee, Kyusong
    2021 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL-HLT 2021), 2021, : 565 - 575
  • [10] Learning Open-domain Comparable Entity Graphs from User Search Queries
    Jiang, Ziheng
    Ji, Lei
    Zhang, Jianwen
    Yan, Jun
    Guo, Ping
    Liu, Ning
    PROCEEDINGS OF THE 22ND ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT (CIKM'13), 2013, : 2339 - 2344