Towards JavaScript program repair with Generative Pre-trained Transformer (GPT-2)

Cited by: 0
Authors
Lajko, Mark [1 ]
Csuvik, Viktor [1 ]
Vidacs, Laszlo [1 ]
Affiliations
[1] University of Szeged, MTA-SZTE Research Group on Artificial Intelligence, Department of Software Engineering, Szeged, Hungary
Keywords
Compendex;
Source
3rd IEEE/ACM International Workshop on Automated Program Repair, APR 2022
Abstract
Deep learning
Pages: 61 - 68
Related papers
50 records in total
  • [1] Towards JavaScript program repair with Generative Pre-trained Transformer (GPT-2)
    Lajko, Mark
    Csuvik, Viktor
    Vidacs, Laszlo
    INTERNATIONAL WORKSHOP ON AUTOMATED PROGRAM REPAIR (APR 2022), 2022, : 61 - 68
  • [2] Generative pre-trained transformer (GPT)-4 support for differential diagnosis in neuroradiology
    Sorin, Vera
    Klang, Eyal
    Sobeh, Tamer
    Konen, Eli
    Shrot, Shai
    Livne, Adva
    Weissbuch, Yulian
    Hoffmann, Chen
    Barash, Yiftach
    QUANTITATIVE IMAGING IN MEDICINE AND SURGERY, 2024, 14 (10)
  • [3] Generative Pre-trained Transformer 4 (GPT-4) in clinical settings
    Bellini, Valentina
    Bignami, Elena Giovanna
    LANCET DIGITAL HEALTH, 2025, 7 (01): : e6 - e7
  • [4] Generative Pre-Trained Transformer (GPT) in Research: A Systematic Review on Data Augmentation
    Sufi, Fahim
    INFORMATION, 2024, 15 (02)
  • [5] Generative pre-trained transformers (GPT) for surface engineering
    Kamnis, Spyros
    SURFACE & COATINGS TECHNOLOGY, 2023, 466
  • [6] GPT-NAS: Neural Architecture Search Meets Generative Pre-Trained Transformer Model
    Yu, Caiyang
    Liu, Xianggen
    Wang, Yifan
    Liu, Yun
    Feng, Wentao
    Deng, Xiong
    Tang, Chenwei
    Lv, Jiancheng
    BIG DATA MINING AND ANALYTICS, 2025, 8 (01): : 45 - 64
  • [7] GPT-LS: Generative Pre-Trained Transformer with Offline Reinforcement Learning for Logic Synthesis
    Lv, Chenyang
    Wei, Ziling
    Qian, Weikang
    Ye, Junjie
    Feng, Chang
    He, Zhezhi
    2023 IEEE 41ST INTERNATIONAL CONFERENCE ON COMPUTER DESIGN, ICCD, 2023, : 320 - 326
  • [8] GPT2MVS: Generative Pre-trained Transformer-2 for Multi-modal Video Summarization
    Huang, Jia-Hong
    Murn, Luka
    Mrak, Marta
    Worring, Marcel
    PROCEEDINGS OF THE 2021 INTERNATIONAL CONFERENCE ON MULTIMEDIA RETRIEVAL (ICMR '21), 2021, : 580 - 589
  • [9] HELM-GPT: de novo macrocyclic peptide design using generative pre-trained transformer
    Xu, Xiaopeng
    Xu, Chencheng
    He, Wenjia
    Wei, Lesong
    Li, Haoyang
    Zhou, Juexiao
    Zhang, Ruochi
    Wang, Yu
    Xiong, Yuanpeng
    Gao, Xin
    BIOINFORMATICS, 2024, 40 (06)
  • [10] Considering the possibilities and pitfalls of Generative Pre-trained Transformer 3 (GPT-3) in healthcare delivery
    Diane M. Korngiebel
    Sean D. Mooney
    npj Digital Medicine, 4