Quantum space-efficient large language models for Prolog query translation

Cited by: 0
Authors
Ahmed, Roshan [1 ]
Sridevi, S. [2 ]
Affiliations
[1] Vellore Inst Technol, Sch Comp Sci & Engn, Dept AI & Robot, Chennai 600127, Tamil Nadu, India
[2] Vellore Inst Technol, Sch Comp Sci & Engn, Chennai 600127, Tamil Nadu, India
Keywords
Word2Vec; Large language model; Generative AI; Quantum computing; Quantum machine learning; Transfer learning; Prolog;
DOI
10.1007/s11128-024-04559-8
CLC classification: O4 [Physics]
Discipline code: 0702
Abstract
As large language models (LLMs) grow in complexity, their size increases exponentially, in step with Moore's law. Running such models poses a significant challenge, as classical computers may lack the space to store or execute their parameters. In this context, hybrid quantum machine learning for language models offers a promising way to mitigate this issue by reducing the storage required for model parameters. Although pure quantum language models have shown success in the recent past, they are constrained by limited features and availability. In this research, we propose DeepKet, a model with a quantum embedding layer that uses the Hilbert space generated by quantum entanglement to store feature vectors, yielding a significant reduction in size. The experimental analysis evaluates the performance of open-source pre-trained models such as Microsoft Phi and CodeGen, fine-tuned to generate Prolog code for geospatial data retrieval, and investigates the effectiveness of quantum DeepKet embedding layers by comparing their total parameter counts with those of traditional models.
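The space argument in the abstract can be made concrete with a back-of-envelope comparison. This is only a sketch: the abstract does not specify DeepKet's encoding scheme, so amplitude encoding and the vocabulary/dimension figures below are illustrative assumptions, not the paper's actual configuration.

```python
import math

def classical_embedding_params(vocab_size: int, dim: int) -> int:
    # A classical embedding table stores one dense d-dimensional
    # vector per token, so it needs vocab_size * dim parameters.
    return vocab_size * dim

def qubits_for_amplitude_encoding(dim: int) -> int:
    # Amplitude encoding packs a d-dimensional vector into the
    # 2^n-dimensional Hilbert space of n = ceil(log2 d) qubits,
    # which is the source of the exponential space reduction.
    return math.ceil(math.log2(dim))

# Illustrative sizes (assumed, not from the paper):
vocab, dim = 50_000, 768
print(classical_embedding_params(vocab, dim))  # 38400000 parameters
print(qubits_for_amplitude_encoding(dim))      # 10 qubits per vector
```

Under these assumptions a classical table of 38.4 M parameters maps onto states of roughly 10 qubits per vector, which illustrates why a quantum embedding layer can shrink the dominant parameter block of an LLM.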
Pages: 20
Related Papers
50 records
  • [1] Space-Efficient Representation of Entity-centric Query Language Models
    Van Gysel, Christophe
    Hannemann, Mirko
    Pusateri, Ernest
    Oualil, Youssef
    Oparin, Ilya
    INTERSPEECH 2022, 2022, : 679 - 683
  • [2] End-to-End Space-Efficient Pipeline for Natural Language Query based Spacecraft Health Data Analytics using Large Language Model (LLM)
    Ram, Gummuluri Venkata Ravi
    Ashinee, Kesanam
    Kumar, M. Anand
    2024 5TH INTERNATIONAL CONFERENCE ON INNOVATIVE TRENDS IN INFORMATION TECHNOLOGY, ICITIIT 2024, 2024,
  • [3] PipeFilter: Parallelizable and Space-Efficient Filter for Approximate Membership Query
    Ji, Shankui
    Du, Yang
    Huang, He
    Sun, Yu-E
    Liu, Jia
    Shu, Yapeng
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2025, 37 (05) : 2816 - 2830
  • [4] Space-efficient Query Evaluation over Probabilistic Event Streams
    Alur, Rajeev
    Chen, Yu
    Jothimurugan, Kishor
    Khanna, Sanjeev
    PROCEEDINGS OF THE 35TH ANNUAL ACM/IEEE SYMPOSIUM ON LOGIC IN COMPUTER SCIENCE (LICS 2020), 2020, : 74 - 87
  • [5] Space-efficient flash translation layer for CompactFlash systems
    Kim, J
    Kim, JM
    Noh, SH
    Min, SL
    Cho, Y
    IEEE TRANSACTIONS ON CONSUMER ELECTRONICS, 2002, 48 (02) : 366 - 375
  • [6] Space-efficient binary optimization for variational quantum computing
    Glos, Adam
    Krawiec, Aleksandra
    Zimboras, Zoltan
    NPJ QUANTUM INFORMATION, 2022, 8 (01)
  • [7] Space-Efficient and Noise-Robust Quantum Factoring
    Ragavan, Seyoon
    Vaikuntanathan, Vinod
    ADVANCES IN CRYPTOLOGY - CRYPTO 2024, PT VI, 2024, 14925 : 107 - 140
  • [8] A space-efficient algorithm for aligning large genomic sequences
    Morgenstern, B
    BIOINFORMATICS, 2000, 16 (10) : 948 - 949
  • [9] PairwiseHist: Fast, Accurate and Space-Efficient Approximate Query Processing with Data Compression
    Hurst, Aaron
    Lucani, Daniel E.
    Zhang, Qi
    PROCEEDINGS OF THE VLDB ENDOWMENT, 2024, 17 (06): : 1432 - 1445