Quantum space-efficient large language models for Prolog query translation

Cited: 0
Authors
Ahmed, Roshan [1 ]
Sridevi, S. [2 ]
Affiliations
[1] Vellore Inst Technol, Sch Comp Sci & Engn, Dept AI & Robot, Chennai 600127, Tamil Nadu, India
[2] Vellore Inst Technol, Sch Comp Sci & Engn, Chennai 600127, Tamil Nadu, India
Keywords
Word2Vec; Large language model; Generative AI; Quantum computing; Quantum machine learning; Transfer learning; Prolog;
DOI
10.1007/s11128-024-04559-8
CLC classification
O4 [Physics];
Discipline code
0702 ;
Abstract
As large language models (LLMs) continue to grow in complexity, their size increases exponentially, following Moore's law. Implementing such complex tasks with LLMs is therefore challenging, as classical computers may lack the space to run the models or store their parameters. In this context, leveraging the principles of hybrid quantum machine learning for language models offers a promising way to mitigate this issue by reducing the storage required for model parameters. Although purely quantum language models have demonstrated success in the recent past, they are constrained by limited features and availability. In this research we propose DeepKet, a model with a quantum embedding layer that utilizes the Hilbert space generated by quantum entanglement to store feature vectors, leading to a significant reduction in size. The experimental analysis evaluates the performance of open-source pre-trained models such as Microsoft Phi and CodeGen, fine-tuned to generate Prolog code for geospatial data retrieval, and investigates the effectiveness of the quantum DeepKet embedding layer by comparing its parameter count with the total parameter count of traditional models.
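The space saving claimed for quantum embedding layers rests on a general principle: n qubits span a 2^n-dimensional Hilbert space, so a d-dimensional feature vector can in principle be held in only ceil(log2(d)) qubits via amplitude encoding. The sketch below simulates this classically with NumPy to illustrate the counting argument only; the vocabulary size, embedding dimension, and function names are illustrative assumptions, not the paper's DeepKet implementation.

```python
import numpy as np

def amplitude_encode(vec):
    """Turn a d-dimensional feature vector into a valid quantum
    statevector (amplitude encoding): pad to the next power of two,
    then normalize so the squared amplitudes sum to 1."""
    d = len(vec)
    n_qubits = int(np.ceil(np.log2(d)))
    padded = np.zeros(2 ** n_qubits)
    padded[:d] = vec
    return padded / np.linalg.norm(padded), n_qubits

# Illustrative comparison: classical embedding table vs. qubit count.
vocab_size, embed_dim = 50_000, 768          # assumed, typical LLM-scale values
classical_params = vocab_size * embed_dim    # 38.4M stored floats
qubits_per_vector = int(np.ceil(np.log2(embed_dim)))  # 10 qubits hold 1024 >= 768 amplitudes

state, n = amplitude_encode(np.arange(1, 9, dtype=float))
print(n)                               # 3 qubits suffice for an 8-dim vector
print(np.isclose(state @ state, 1.0))  # True: a normalized statevector
```

Note that this only demonstrates the exponential compression of *storage*; preparing and reading out such states on hardware has its own costs, which is part of what the paper's experimental comparison addresses.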
Pages: 20
Related papers (50 records)
  • [21] Improving Emulation of Quantum Algorithms using Space-Efficient Hardware Architectures
    Mahmud, Naveed
    El-Araby, Esam
    2019 IEEE 30TH INTERNATIONAL CONFERENCE ON APPLICATION-SPECIFIC SYSTEMS, ARCHITECTURES AND PROCESSORS (ASAP 2019), 2019, : 206 - 213
  • [22] Statistical query translation models for cross-language information retrieval
    Microsoft Research
    ACM Trans. Asian Lang. Inf. Process., 2006, 4: 323 - 359
  • [23] Space-efficient computation of k-mer dictionaries for large values of k
    Diaz-Dominguez, Diego
    Leinonen, Miika
    Salmela, Leena
    ALGORITHMS FOR MOLECULAR BIOLOGY, 2024, 19 (01)
  • [24] Transferable adversarial distribution learning: Query-efficient adversarial attack against large language models
    Dong, Huoyuan
    Dong, Jialiang
    Wan, Shaohua
    Yuan, Shuai
    Guan, Zhitao
    COMPUTERS & SECURITY, 2023, 135
  • [25] Impact behaviour and design models for space-efficient ring-shear structures
    Wang, Genda
    Lu, Zhaijun
    Jiao, Peng
    Liu, Jiefu
    Chen, Zhiping
    MATERIALS TODAY COMMUNICATIONS, 2024, 41
  • [26] Unfreeze with Care: Space-Efficient Fine-Tuning of Semantic Parsing Models
    Sun, Weiqi
    Khan, Haidar
    des Mesnards, Nicolas Guenon
    Rubino, Melanie
    Arkoudas, Konstantine
    PROCEEDINGS OF THE ACM WEB CONFERENCE 2022 (WWW'22), 2022, : 999 - 1007
  • [28] Corpus-Steered Query Expansion with Large Language Models
    Lei, Yibin
    Cao, Yu
    Zhou, Tianyi
    Shen, Tao
    Yates, Andrew
    PROCEEDINGS OF THE 18TH CONFERENCE OF THE EUROPEAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, VOL 2: SHORT PAPERS, 2024, : 393 - 401
  • [29] Query Rewriting for Retrieval-Augmented Large Language Models
    Ma, Xinbei
    Gong, Yeyun
    He, Pengcheng
    Zhao, Hai
    Duan, Nan
    2023 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING, EMNLP 2023, 2023, : 5303 - 5315
  • [30] Converting Continuous-Space Language Models into N-gram Language Models with Efficient Bilingual Pruning for Statistical Machine Translation
    Wang, Rui
    Utiyama, Masao
    Goto, Isao
    Sumita, Eiichiro
    Zhao, Hai
    Lu, Bao-Liang
    ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING, 2016, 15 (03)