Automatic Modulation Classification Based on CNN-Transformer Graph Neural Network

Cited by: 12
Authors
Wang, Dong [1 ,2 ]
Lin, Meiyan [1 ,2 ]
Zhang, Xiaoxu [1 ,2 ]
Huang, Yonghui [1 ]
Zhu, Yan [1 ]
Affiliations
[1] Chinese Acad Sci, Natl Space Sci Ctr, Key Lab Elect & Informat Technol Space Syst, Beijing 100190, Peoples R China
[2] Univ Chinese Acad Sci, Beijing 100049, Peoples R China
Keywords
deep learning; modulation classification; graph neural network; transformer network; RECOGNITION; MODEL
DOI
10.3390/s23167281
Chinese Library Classification
O65 [Analytical Chemistry]
Discipline Codes
070302; 081704
Abstract
In recent years, neural network algorithms have demonstrated tremendous potential for modulation classification. Deep learning methods typically take raw signals, or signals converted into time-frequency images, as inputs to convolutional neural networks (CNNs) or recurrent neural networks (RNNs). With the advancement of graph neural networks (GNNs), however, a new approach has emerged that transforms time series into graph structures. In this study, we propose a CNN-transformer graph neural network (CTGNet) for modulation classification that uncovers complex representations in signal data. First, we apply sliding-window processing to the original signals, obtaining signal subsequences and reorganizing them into a signal subsequence matrix. We then employ CTGNet, which adaptively maps the preprocessed signal matrices into graph structures, and use a graph neural network based on GraphSAGE and DMoNPool for classification. Extensive experiments show that our method outperforms advanced deep learning baselines, achieving the highest recognition accuracy. This underscores CTGNet's advantage in capturing key features of signal data and provides an effective solution for modulation classification tasks.
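The abstract's preprocessing step (sliding a window over the raw signal and stacking the resulting subsequences into a matrix) can be sketched as follows. This is a minimal illustration only: the function name and the example window/stride values are hypothetical, since the abstract does not specify the paper's actual parameters.

```python
import numpy as np

def to_subsequence_matrix(signal, window, stride):
    """Slide a fixed-length window over a 1-D signal and stack the
    subsequences row-wise into a matrix (one row per window position).

    Hypothetical helper; window/stride values below are illustrative.
    """
    n = len(signal)
    starts = range(0, n - window + 1, stride)
    return np.stack([signal[i:i + window] for i in starts])

# Example: a length-16 signal with window 4 and stride 4
# yields a 4 x 4 subsequence matrix.
sig = np.arange(16, dtype=float)
M = to_subsequence_matrix(sig, window=4, stride=4)
print(M.shape)  # (4, 4)
```

A matrix of this shape is what CTGNet would then map into a graph structure; with overlapping windows (stride < window), adjacent rows share samples, which is one plausible basis for edges between subsequence nodes.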
Pages: 23
Related Papers
50 records in total
  • [21] CMTNet: a hybrid CNN-transformer network for UAV-based hyperspectral crop classification in precision agriculture
    Xihong Guo
    Quan Feng
    Faxu Guo
    Scientific Reports, 15 (1)
  • [22] C-TUnet: A CNN-Transformer Architecture-Based Ultrasound Breast Image Classification Network
    Wu, Ying
    Li, Faming
    Xu, Bo
    INTERNATIONAL JOURNAL OF IMAGING SYSTEMS AND TECHNOLOGY, 2025, 35 (01)
  • [23] CNN-Transformer and Channel-Spatial Attention based network for hyperspectral image classification with few samples
    Fu, Chuan
    Zhou, Tianyuan
    Guo, Tan
    Zhu, Qikui
    Luo, Fulin
    Du, Bo
    NEURAL NETWORKS, 2025, 186
  • [24] TranSenseFusers: A temporal CNN-Transformer neural network family for explainable PPG-based stress detection
    Kasnesis, Panagiotis
    Chatzigeorgiou, Christos
    Feidakis, Michalis
    Gutierrez, Alvaro
    Patrikakis, Charalampos Z.
    BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2025, 102
  • [25] TACT: Text attention based CNN-Transformer network for polyp segmentation
    Zhao, Yiyang
    Li, Jinjiang
    Hua, Zhen
    INTERNATIONAL JOURNAL OF IMAGING SYSTEMS AND TECHNOLOGY, 2024, 34 (02)
  • [26] Hybrid CNN-transformer network for efficient CSI feedback
    Zhao, Ruohan
    Liu, Ziang
    Song, Tianyu
    Jin, Jiyu
    Jin, Guiyue
    Fan, Lei
    PHYSICAL COMMUNICATION, 2024, 66
  • [27] CNN-Transformer based emotion classification from facial expressions and body gestures
    Karatay, Buşra
    Beştepe, Deniz
    Sailunaz, Kashfia
    Özyer, Tansel
    Alhajj, Reda
    MULTIMEDIA TOOLS AND APPLICATIONS, 2023, 83 (8) : 23129 - 23171
  • [28] Image harmonization with Simple Hybrid CNN-Transformer Network
    Li, Guanlin
    Zhao, Bin
    Li, Xuelong
    NEURAL NETWORKS, 2024, 180
  • [29] Automatic Modulation Classification Based on Bispectrum and CNN
    Li, Yongbin
    Shao, Gaoping
    Wang, Bin
    PROCEEDINGS OF 2019 IEEE 8TH JOINT INTERNATIONAL INFORMATION TECHNOLOGY AND ARTIFICIAL INTELLIGENCE CONFERENCE (ITAIC 2019), 2019, : 311 - 316