ECC-BERT: Classification of error correcting codes using the improved bidirectional encoder representation from transformers

Cited by: 2
Authors
Li, Sida [1 ]
Hu, Xiaochang [1 ]
Huang, Zhiping [1 ]
Zhou, Jing [1 ]
Affiliations
[1] Natl Univ Def Technol, Coll Intelligent Sci, Deya St 109, Changsha 410070, Hunan, Peoples R China
Keywords
RECONSTRUCTION;
DOI
10.1049/cmu2.12357
Chinese Library Classification
TM [Electrical Technology]; TN [Electronics and Communication Technology];
Discipline Codes
0808 ; 0809 ;
Abstract
The recently introduced concept of contextual information in error correcting codes (ECC) can significantly improve the capacity for blind recognition of codes with deep learning (DL) approaches. However, existing DL-based methods suffer from two fundamental limitations, inflexible structures and limited kernel sizes, which make it difficult to exploit the characteristics of contextual information in ECC. To address this problem, a state-of-the-art framework for natural language processing (NLP), bidirectional encoder representation from transformers (BERT), is applied to ECC classification. To strengthen the effect of contextual information, the BERT model is improved with weighted relative positional encoding and error bit embedding. The proposed approach achieves higher classification accuracy than methods based on Gauss-Jordan elimination and traditional deep learning schemes. Further simulation results show that the classification accuracy depends strongly on the block length and on the use of weighted relative positional encoding and error bit embedding.
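The abstract does not give implementation details for the two proposed modifications, so the following is only a minimal sketch of how an input representation combining bit embeddings, an error bit embedding, and a distance-weighted relative positional encoding might look. All names, dimensions, and the 1/(1+|offset|) weighting scheme are illustrative assumptions, not the authors' actual design; in the paper these components would sit inside a trained BERT encoder rather than a standalone function.

```python
import numpy as np

rng = np.random.default_rng(0)

d_model = 8     # embedding dimension (illustrative, not from the paper)
block_len = 16  # codeword block length (illustrative)

# Token embedding: map each bit value (0/1) to a d_model-dim vector.
bit_embed = rng.normal(size=(2, d_model))

# Hypothetical "error bit embedding": a single learned vector added at
# positions flagged as likely erroneous (e.g. low-reliability bits).
error_embed = rng.normal(size=(d_model,))

# Relative positional encoding: one vector per relative offset in
# [-(block_len-1), block_len-1], to be scaled by a distance weight.
rel_embed = rng.normal(size=(2 * block_len - 1, d_model))

def encode(bits, error_mask):
    """Build the input representation for one codeword block."""
    x = bit_embed[bits]                        # (block_len, d_model)
    x = x + error_mask[:, None] * error_embed  # mark suspected error bits
    # Weighted relative positional encoding: nearby offsets get larger
    # weights, so local context dominates (assumed weighting scheme).
    n = len(bits)
    for i in range(n):
        for j in range(n):
            offset = j - i
            w = 1.0 / (1.0 + abs(offset))
            x[i] += w * rel_embed[offset + block_len - 1] / n
    return x

bits = rng.integers(0, 2, size=block_len)
error_mask = (rng.random(block_len) < 0.1).astype(float)
rep = encode(bits, error_mask)
```

The resulting `rep` array would then be fed to the transformer encoder; in a real model the embeddings would be learned end to end rather than sampled randomly.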
Pages: 359-368
Page count: 10