MetaQA: Enhancing human-centered data search using Generative Pre-trained Transformer (GPT) language model and artificial intelligence

Cited by: 3
Authors:
Li, Diya [1]
Zhang, Zhe [1,2]
Affiliations:
[1] Texas A&M University, Department of Geography, College Station, TX 77843, USA
[2] Texas A&M University, Department of Electrical and Computer Engineering, College Station, TX 77843, USA
Source:
PLOS ONE | 2023, Vol. 18, Issue 11
DOI: 10.1371/journal.pone.0293034
Chinese Library Classification: O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences]
Subject Classification Codes: 07; 0710; 09
Abstract:
Accessing and utilizing geospatial data from various sources is essential for scientific research that addresses complex scientific and societal challenges requiring interdisciplinary knowledge. The traditional keyword-based geosearch approach is insufficient due to the uncertainty inherent in spatial information and in how it is presented on data-sharing platforms. For instance, the Gulf of Mexico Coastal Ocean Observing System (GCOOS) data search platform stores geoinformation and metadata in a complex tabular format. Users can search for data by entering keywords or selecting data from a drop-down menu in the user interface. However, the search results provide limited information about the data product: detailed descriptions, potential uses, and relationships with other data products are still missing. Language models (LMs) have demonstrated great potential in tasks such as question answering, sentiment analysis, text classification, and machine translation. However, they struggle when dealing with metadata represented in tabular format. To overcome these challenges, we developed the Meta Question Answering System (MetaQA), a novel spatial data search model. MetaQA integrates end-to-end AI models with a generative pre-trained transformer (GPT) to enhance geosearch services. Using GCOOS metadata as a case study, we tested the effectiveness of MetaQA. The results revealed that MetaQA outperforms state-of-the-art question-answering models in handling tabular metadata, underlining its potential for user-inspired geosearch services.
Pages: 20
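The abstract describes question answering over tabular metadata records. The following is a minimal, hypothetical sketch of that idea, not the authors' MetaQA implementation: it runs a publicly available table-question-answering model (TAPAS, via the Hugging Face transformers pipeline) over a small, invented metadata table loosely resembling geospatial catalog entries. Dataset names, column headers, and the example question are all assumptions made for illustration.

```python
# Hypothetical sketch (not the authors' MetaQA code): question answering over a
# tabular metadata record using a public table-QA model. The rows below are
# invented; a real deployment would load actual GCOOS catalog metadata.
from transformers import pipeline

# Toy metadata table resembling rows of a geospatial data catalog.
metadata_table = {
    "dataset": ["Buoy 42001 winds", "Texas coastal bathymetry", "HF radar surface currents"],
    "variable": ["wind speed", "water depth", "current velocity"],
    "region": ["Gulf of Mexico", "Texas coast", "Gulf of Mexico"],
    "update_frequency": ["hourly", "static", "hourly"],
}

# Publicly available TAPAS checkpoint fine-tuned for table question answering.
table_qa = pipeline(
    "table-question-answering",
    model="google/tapas-base-finetuned-wtq",
)

question = "Which datasets cover the Gulf of Mexico?"
result = table_qa(table=metadata_table, query=question)
print(result["answer"])  # e.g. "Buoy 42001 winds, HF radar surface currents"
```

This sketch covers only the table-answering step; per the abstract, MetaQA additionally couples such end-to-end models with a GPT component to generate richer descriptions of the matched data products and their relationships.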