Quantum space-efficient large language models for Prolog query translation

Cited by: 0
|
Authors
Ahmed, Roshan [1 ]
Sridevi, S. [2 ]
Affiliations
[1] Vellore Inst Technol, Sch Comp Sci & Engn, Dept AI & Robot, Chennai 600127, Tamil Nadu, India
[2] Vellore Inst Technol, Sch Comp Sci & Engn, Chennai 600127, Tamil Nadu, India
Keywords
Word2Vec; Large language model; Generative AI; Quantum computing; Quantum machine learning; Transfer learning; Prolog;
DOI
10.1007/s11128-024-04559-8
Chinese Library Classification (CLC)
O4 [Physics];
Discipline code
0702;
Abstract
As large language models (LLMs) grow in complexity, their parameter counts increase exponentially, in a manner reminiscent of Moore's law. Deploying such models poses a significant challenge, as classical computers may lack the memory needed to run them or store their parameters. In this context, hybrid quantum machine learning offers a promising way to mitigate the issue by reducing the storage space required for model parameters. Although purely quantum language models have shown success in the recent past, they remain constrained by limited features and availability. In this research we propose DeepKet, a model with a quantum embedding layer that uses the Hilbert space generated by quantum entanglement to store feature vectors, yielding a significant reduction in size. The experimental analysis evaluates open-source pre-trained models such as Microsoft Phi and CodeGen, fine-tuned to generate Prolog code for geo-spatial data retrieval, and investigates the effectiveness of quantum DeepKet embedding layers by comparing their parameter counts with those of traditional models.
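The storage advantage claimed for quantum embedding layers rests on a general property of quantum states: an n-qubit state has 2^n amplitudes, so an exponentially large feature vector can in principle be held in the Hilbert space of linearly many qubits. The minimal sketch below (a hypothetical illustration, not the paper's actual DeepKet implementation) shows the counting argument for amplitude encoding using plain NumPy.

```python
import numpy as np

def amplitude_encode(vec):
    """Normalize a real feature vector so it forms a valid quantum
    state, i.e. its squared amplitudes sum to 1."""
    vec = np.asarray(vec, dtype=float)
    norm = np.linalg.norm(vec)
    if norm == 0:
        raise ValueError("cannot encode the zero vector")
    return vec / norm

dim = 16                      # classical embedding dimension (2^4)
n_qubits = int(np.log2(dim))  # qubits needed to hold all 16 amplitudes
state = amplitude_encode(np.arange(1, dim + 1))

# 4 qubits suffice for a 16-dimensional embedding; storage grows
# with n rather than 2^n.
print(n_qubits)                            # 4
print(round(float(np.sum(state ** 2)), 6))  # 1.0 (valid quantum state)
```

In practice the encoding circuit itself costs gates, and the parameter savings the paper reports come from the trained embedding layer, so this sketch only illustrates the dimension-counting intuition.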
Pages: 20
Related Papers
50 records in total
  • [31] Efficient Machine Translation Decoding with Slow Language Models
    Emami, Ahmad
    16TH ANNUAL CONFERENCE OF THE INTERNATIONAL SPEECH COMMUNICATION ASSOCIATION (INTERSPEECH 2015), VOLS 1-5, 2015, : 2376 - 2379
  • [32] A space-efficient quantum computer simulator suitable for high-speed FPGA implementation
    Frank, Michael P.
    Oniciuc, Liviu
    Meyer-Baese, Uwe H.
    Chiorescu, Irinel
    QUANTUM INFORMATION AND COMPUTATION VII, 2009, 7342
  • [33] Toffoli gate count optimized space-efficient quantum circuit for binary field multiplication
    Kim, Sunyeop
    Kim, Insung
    Kim, Seonggyeom
    Hong, Seokhie
    QUANTUM INFORMATION PROCESSING, 2024, 23 (10)
  • [34] Exploring Large Language Models in Intent Acquisition and Translation
    Fontana, Mattia
    Martini, Barbara
    Sciarrone, Filippo
    2024 IEEE 10TH INTERNATIONAL CONFERENCE ON NETWORK SOFTWARIZATION, NETSOFT 2024, 2024, : 231 - 234
  • [35] Improving Machine Translation Formality with Large Language Models
    Yang, Murun
    Li, Fuxue
    CMC-COMPUTERS MATERIALS & CONTINUA, 2025, 82 (02): : 2061 - 2075
  • [36] LEVERAGING LARGE LANGUAGE MODELS WITH VOCABULARY SHARING FOR SIGN LANGUAGE TRANSLATION
    Lee, Huije
    Kim, Jung-Ho
    Hwang, Eui Jun
    Kim, Jaewoo
    Park, Jong C.
    2023 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING WORKSHOPS, ICASSPW, 2023,
  • [37] Enhancing Interactive Image Retrieval With Query Rewriting Using Large Language Models and Vision Language Models
    Zhu, Hongyi
    Huang, Jia-Hong
    Rudinac, Stevan
    Kanoulas, Evangelos
    PROCEEDINGS OF THE 4TH ANNUAL ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA RETRIEVAL, ICMR 2024, 2024, : 978 - 987
  • [38] Synthetic Query Generation using Large Language Models for Virtual Assistants
    Sannigrahi, Sonal
    Fraga-Silva, Thiago
    Oualil, Youssef
    Van Gysel, Christophe
    PROCEEDINGS OF THE 47TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL, SIGIR 2024, 2024, : 2837 - 2841
  • [39] New space-efficient quantum algorithm for binary elliptic curves using the optimized division algorithm
    Kim, Hyeonhak
    Hong, Seokhie
    QUANTUM INFORMATION PROCESSING, 2023, 22 (06)