Bug-Transformer: Automated Program Repair Using Attention-Based Deep Neural Network

Cited by: 3
Authors
Yao, Jie [1 ]
Rao, Bingbing [2 ]
Xing, Weiwei [1 ]
Wang, Liqiang [2 ]
Affiliations
[1] Beijing Jiaotong Univ, Sch Software Engn, Beijing 100044, Peoples R China
[2] Univ Cent Florida, Dept Comp Sci, Orlando, FL 32816 USA
Keywords
Neural Language Models; Context Abstraction; Attention Mechanism; Transformer; Automated Program Repair;
DOI
10.1142/S0218126622502103
Chinese Library Classification (CLC)
TP3 [Computing technology, computer technology]
Discipline Code
0812
Abstract
In this paper, we propose a novel Transformer-based deep neural network model that learns semantic bug patterns from a corpus of buggy/fixed code and then generates correct code automatically. The Transformer is a deep learning model that relies entirely on an attention mechanism to model global dependencies between input and output. Although there have been a few endeavors to repair programs by learning neural language models (NLMs), many special program properties, such as the structure and semantics of identifiers, are not effectively taken into account when embedding the input sequence and designing the model, which results in unsatisfactory performance. In the proposed Bug-Transformer, we design a novel context abstraction mechanism to better support neural language models. Specifically, it is capable of (1) compressing code information while preserving its key structure and semantics, which provides more thorough information to NLM models; (2) renaming identifiers and literals based on their lexical scopes and their structural and semantic information, to reduce the code vocabulary size; and (3) preserving keywords and selected idioms (domain- or developer-specific vocabularies) for a better understanding of code structure and semantics. Hence, Bug-Transformer adequately embeds code structural and semantic information into the input data and optimizes an attention-based Transformer neural network to handle code features well, thereby improving learning for the bug-repair task. We comprehensively evaluate the proposed work on three datasets (Java code corpora) and generate patches for buggy code using a beam search decoder. The experimental results show that our proposed work outperforms state-of-the-art techniques: Bug-Transformer successfully predicts 54.81%, 34.45%, and 42.40% of the fixed code in the three datasets, respectively, outperforming the baseline models. These success rates increase steadily as the beam size grows. Moreover, the overall syntactic correctness of all patches remains above 97%, 96%, and 50% on the three benchmarks, respectively, regardless of the beam size.
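As a concrete picture of the context abstraction described in the abstract, the sketch below renames identifiers and literals to compact placeholders while keeping keywords and idioms intact. Everything here is an assumption for illustration: the keyword set, the idiom list, the regex tokenizer, and the function name abstract_code are invented, and the sketch uses flat renaming where the paper's mechanism also exploits lexical scope and structural/semantic information.

```python
import re

# Illustrative only: these sets and the tokenizer are assumptions,
# not Bug-Transformer's actual abstraction pipeline.
JAVA_KEYWORDS = {
    "public", "private", "static", "void", "int", "boolean", "if",
    "else", "return", "for", "while", "new", "class", "null",
    "true", "false",
}
# Idioms (domain-/developer-specific tokens) are preserved verbatim,
# as the abstract describes.
IDIOMS = {"size", "length", "equals", "i", "j", "0", "1"}

# Matches string literals, identifiers, integer literals, or any symbol.
TOKEN_RE = re.compile(r'"[^"]*"|[A-Za-z_]\w*|\d+|\S')

def abstract_code(src: str) -> str:
    """Rename identifiers/literals to placeholders while keeping
    keywords and idioms, shrinking the model's vocabulary.
    (Flat renaming; the paper additionally uses lexical scope.)"""
    mapping, counters = {}, {"VAR": 0, "INT": 0, "STR": 0}

    def fresh(kind: str, tok: str) -> str:
        if tok not in mapping:
            counters[kind] += 1
            mapping[tok] = f"{kind}_{counters[kind]}"
        return mapping[tok]

    out = []
    for tok in TOKEN_RE.findall(src):
        if tok in JAVA_KEYWORDS or tok in IDIOMS:
            out.append(tok)                      # structure-bearing: keep
        elif tok.startswith('"'):
            out.append(fresh("STR", tok))        # string literal
        elif tok.isdigit():
            out.append(fresh("INT", tok))        # numeric literal
        elif re.fullmatch(r"[A-Za-z_]\w*", tok):
            out.append(fresh("VAR", tok))        # identifier
        else:
            out.append(tok)                      # punctuation/operators
    return " ".join(out)

buggy = 'public int getTotal() { int total = count + 1; return total; }'
print(abstract_code(buggy))
# -> public int VAR_1 ( ) { int VAR_2 = VAR_3 + 1 ; return VAR_2 ; }
```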
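The patch generation step can likewise be sketched. Below is a minimal, generic beam-search decoder; beam_search and the toy step function are hypothetical stand-ins, with step_fn assumed to return (token_id, log_prob) continuations where a real run would query the trained Transformer decoder.

```python
import math

def beam_search(step_fn, bos_id, eos_id, beam_size=5, max_len=64):
    """Minimal beam-search sketch: keep the `beam_size` highest
    log-probability partial patches at each step."""
    beams = [([bos_id], 0.0)]      # (token sequence, cumulative log-prob)
    finished = []
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            for tok, logp in step_fn(seq):
                candidates.append((seq + [tok], score + logp))
        if not candidates:
            break
        candidates.sort(key=lambda c: c[1], reverse=True)
        beams = []
        for seq, score in candidates:
            # Sequences that emit end-of-sequence become candidate patches.
            (finished if seq[-1] == eos_id else beams).append((seq, score))
            if len(beams) == beam_size:
                break
        if not beams:
            break
    # Unfinished beams are kept too, in case max_len was hit first.
    return sorted(finished + beams, key=lambda c: c[1], reverse=True)

# Toy stand-in for the model: real use would query the trained decoder.
def toy_step(prefix):
    if len(prefix) >= 3:
        return [(9, 0.0)]                            # force end-of-sequence
    return [(1, math.log(0.6)), (2, math.log(0.4))]  # two continuations

best_seq, best_logp = beam_search(toy_step, bos_id=0, eos_id=9, beam_size=3)[0]
print(best_seq, best_logp)   # [0, 1, 1, 9] with log-prob log(0.36)
```

Keeping the top beam_size partial sequences per step is what makes the reported success rates grow with the beam size: a larger beam retains more alternative patches for the final ranking.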
Pages: 26
Related Papers
50 records
  • [1] Automated skin lesion segmentation using attention-based deep convolutional neural network
    Arora, Ridhi
    Raman, Balasubramanian
    Nayyar, Kritagya
    Awasthi, Ruchi
    BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2021, 65
  • [2] EEG emotion recognition using attention-based convolutional transformer neural network
    Gong, Linlin
    Li, Mingyang
    Zhang, Tao
    Chen, Wanzhong
    BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2023, 84
  • [3] Retweet Prediction with Attention-based Deep Neural Network
    Zhang, Qi
    Gong, Yeyun
    Wu, Jindou
    Huang, Haoran
    Huang, Xuanjing
    CIKM'16: PROCEEDINGS OF THE 2016 ACM CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, 2016, : 75 - 84
  • [4] Breast Cancer Cell Segmentation Using Attention-Based Deep Neural Network
    Patra, Ankita
    Barpanda, Nalini Kanta
    Sethy, Prabira Kumar
    Das, Ashis
    Behera, Santi Kumari
    Nanda, Amlan
    Proceedings - 2023 IEEE World Conference on Applied Intelligence and Computing, AIC 2023, 2023, : 628 - 631
  • [5] Residential Appliance Detection Using Attention-based Deep Convolutional Neural Network
    Deng, Chunyu
    Wu, Kehe
    Wang, Binbin
CSEE JOURNAL OF POWER AND ENERGY SYSTEMS, 2022, 8 (02): 621 - 633
  • [6] Underwater acoustic target recognition using attention-based deep neural network
    Xiao, Xu
    Wang, Wenbo
    Ren, Qunyan
    Gerstoft, Peter
    Ma, Li
    JASA EXPRESS LETTERS, 2021, 1 (10):
  • [7] Anomaly Detection in Automated Vehicles Using Multistage Attention-Based Convolutional Neural Network
    Javed, Abdul Rehman
    Usman, Muhammad
    Rehman, Saif Ur
    Khan, Mohib Ullah
    Haghighi, Mohammad Sayad
    IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2021, 22 (07) : 4291 - 4300
  • [8] Attention-based deep neural network for driver behavior recognition
    Xiao, Weichu
    Liu, Hongli
    Ma, Ziji
    Chen, Weihong
    FUTURE GENERATION COMPUTER SYSTEMS-THE INTERNATIONAL JOURNAL OF ESCIENCE, 2022, 132 : 152 - 161
  • [9] Attention-based convolutional neural network for deep face recognition
Ling, Hefei
Wu, Jiyang
Huang, Junrui
Chen, Jiazhong
Li, Ping
    Multimedia Tools and Applications, 2020, 79 : 5595 - 5616
  • [10] Deep Attention-based Neural Network for Electricity Theft Detection
    Zhang, Yufan
    Ji, Yugang
    Xiao, Ding
    PROCEEDINGS OF 2020 IEEE 11TH INTERNATIONAL CONFERENCE ON SOFTWARE ENGINEERING AND SERVICE SCIENCE (ICSESS 2020), 2020, : 154 - 157