Quantum space-efficient large language models for Prolog query translation

Cited by: 0
Authors
Ahmed, Roshan [1 ]
Sridevi, S. [2 ]
Affiliations
[1] Vellore Inst Technol, Sch Comp Sci & Engn, Dept AI & Robot, Chennai 600127, Tamil Nadu, India
[2] Vellore Inst Technol, Sch Comp Sci & Engn, Chennai 600127, Tamil Nadu, India
Keywords
Word2Vec; Large language model; Generative AI; Quantum computing; Quantum machine learning; Transfer learning; Prolog;
DOI
10.1007/s11128-024-04559-8
Chinese Library Classification
O4 [Physics]
Discipline Code
0702
Abstract
As large language models (LLMs) continue to grow in complexity, their size increases exponentially, in step with Moore's law. Running such models poses a significant challenge, as classical computers may lack the memory needed to store or execute their parameters. In this context, hybrid quantum machine learning offers a promising way to mitigate the issue by reducing the storage required for model parameters. Although purely quantum language models have shown success in the recent past, they remain constrained by limited features and availability. In this work we propose DeepKet, a model with a quantum embedding layer that exploits the Hilbert space generated by quantum entanglement to store feature vectors, yielding a significant reduction in size. The experimental analysis evaluates open-source pre-trained models such as Microsoft Phi and CodeGen, fine-tuned to generate Prolog code for geo-spatial data retrieval, and investigates the effectiveness of the quantum DeepKet embedding layer by comparing its parameter count with that of traditional models.
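The storage advantage claimed for quantum embedding layers stems from the exponential capacity of Hilbert space: n qubits span a 2^n-dimensional state space, so N classical values can in principle be addressed with roughly log2(N) qubits via amplitude encoding. The following back-of-the-envelope sketch illustrates this scaling only; amplitude encoding is assumed here as the illustrative mechanism, and the paper's actual DeepKet construction may differ.

```python
import math

def classical_embedding_params(vocab_size: int, dim: int) -> int:
    # A classical embedding table stores one weight per (token, feature) pair.
    return vocab_size * dim

def amplitude_encoding_qubits(vocab_size: int, dim: int) -> int:
    # Amplitude encoding packs N real values into the amplitudes of a
    # quantum state on ceil(log2(N)) qubits: exponentially fewer storage
    # units, though state preparation and readout carry their own costs.
    return math.ceil(math.log2(vocab_size * dim))

# Illustrative figures (not taken from the paper): a 50,000-token
# vocabulary with 768-dimensional embeddings.
vocab, dim = 50_000, 768
print(classical_embedding_params(vocab, dim))  # 38,400,000 weights
print(amplitude_encoding_qubits(vocab, dim))   # 26 qubits
```

The comparison mirrors the paper's evaluation strategy of contrasting the quantum embedding layer against the total parameter count of conventional models, though the real trade-off also includes circuit depth for state preparation, which this sketch ignores.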
Pages: 20
Related papers (50 total)
  • [41] Needle: a fast and space-efficient prefilter for estimating the quantification of very large collections of expression experiments
    Darvish, Mitra
    Seiler, Enrico
    Mehringer, Svenja
    Rahn, Rene
    Reinert, Knut
    BIOINFORMATICS, 2022, 38 (17) : 4100 - 4108
  • [42] Recent Advances in Interactive Machine Translation With Large Language Models
    Wang, Yanshu
    Zhang, Jinyi
    Shi, Tianrong
    Deng, Dashuai
    Tian, Ye
    Matsumoto, Tadahiro
    IEEE ACCESS, 2024, 12 : 179353 - 179382
  • [43] Document-Level Machine Translation with Large Language Models
    Wang, Longyue
    Lyu, Chenyang
    Ji, Tianbo
    Zhang, Zhirui
    Yu, Dian
    Shi, Shuming
    Tu, Zhaopeng
    2023 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2023), 2023, : 16646 - 16661
  • [44] An Empirical Study of Translation Hypothesis Ensembling with Large Language Models
    Farinhas, Antonio
    de Souza, Jose G. C.
    Martins, Andre F. T.
    2023 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2023), 2023, : 11956 - 11970
  • [45] Enhancing Translation Validation of Compiler Transformations with Large Language Models
    Wang, Yanzhao
    Xie, Fei
    INTERNATIONAL JOURNAL OF SOFTWARE ENGINEERING AND KNOWLEDGE ENGINEERING, 2025, 35 (01) : 45 - 57
  • [47] A fast and space-efficient boundary element method for computing electrostatic and hydration effects in large molecules
    Zauhar, RJ
    Varnek, A
    JOURNAL OF COMPUTATIONAL CHEMISTRY, 1996, 17 (07) : 864 - 877
  • [48] Eliciting the Translation Ability of Large Language Models via Multilingual Finetuning with Translation Instructions
    Li, Jiahuan
    Zhou, Hao
    Huang, Shujian
    Cheng, Shanbo
    Chen, Jiajun
    TRANSACTIONS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, 2024, 12 : 576 - 592
  • [49] Boosting legal case retrieval by query content selection with large language models
    Zhou, Youchao
    Huang, Heyan
    Wu, Zhijing
    ANNUAL INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL IN THE ASIA PACIFIC REGION, SIGIR-AP 2023, 2023, : 176 - 184
  • [50] Application of large language models to quantum state simulation
    Zhou, Shuangxiang
    Chen, Ronghang
    An, Zheng
    Zhang, Chao
    Hou, Shi-Yao
    SCIENCE CHINA-PHYSICS MECHANICS & ASTRONOMY, 2025, 68 (04)