Transformer-Based Detector for OFDM With Index Modulation

Cited by: 8
Authors
Zhang, Dexin [1]
Wang, Sixian [1]
Niu, Kai [1]
Dai, Jincheng [1]
Wang, Sen [2]
Yuan, Yifei [2]
Affiliations
[1] Beijing Univ Posts & Telecommun BUPT, Key Lab Universal Wireless Commun, Minist Educ, Beijing 100876, Peoples R China
[2] China Mobile Res Inst CMRI, Beijing 100053, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Detectors; Transformers; Indexes; Neural networks; OFDM; Modulation; Feature extraction; Index modulation (IM) detector; deep learning (DL); transformer; self-attention mechanism; LEARNING-BASED DETECTOR;
DOI
10.1109/LCOMM.2022.3158734
CLC Classification Number
TN [Electronic technology, communication technology];
Discipline Classification Code
0809;
Abstract
A deep learning (DL)-based detector utilizing the Transformer framework, termed TransIM, is proposed for orthogonal frequency-division multiplexing with index modulation (OFDM-IM) systems. Concretely, TransIM adopts a two-step detection method: first, neural networks built around a Transformer block output soft probabilities for the candidate transmitted symbols; then, conventional signal detection is performed on those probabilities to make the final decisions. This method significantly improves the system's error performance at the cost of slightly increased complexity. Simulation results indicate that the proposed TransIM detector outperforms existing DL-based detectors in terms of bit error rate (BER).
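As a rough illustration of the two-step pipeline described in the abstract, the PyTorch sketch below maps received OFDM-IM subblocks through a Transformer encoder to soft probabilities (step 1), then takes a conventional hard decision over the candidate set (step 2). This is not the authors' implementation: the feature layout (real/imaginary parts of the received signal and channel estimate per subcarrier), the subblock size N_SUB, the candidate count N_CAND, and all layer sizes are assumptions made for illustration only.

# Minimal sketch of the TransIM-style two-step detection idea.
# All dimensions and the input feature layout are assumed, not from the paper.
import torch
import torch.nn as nn

N_SUB = 4     # subcarriers per OFDM-IM subblock (assumed)
N_CAND = 8    # candidate transmit realizations per subblock (assumed)
D_MODEL = 64  # Transformer embedding width (assumed)

class TransIMDetector(nn.Module):
    def __init__(self):
        super().__init__()
        # Embed 4 real features per subcarrier, e.g. real/imag of the
        # received sample and of its channel estimate (assumed layout).
        self.embed = nn.Linear(4, D_MODEL)
        layer = nn.TransformerEncoderLayer(
            d_model=D_MODEL, nhead=4, dim_feedforward=128, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        # Step 1 head: soft probabilities over the candidate realizations.
        self.head = nn.Linear(N_SUB * D_MODEL, N_CAND)

    def forward(self, y):                 # y: (batch, N_SUB, 4)
        h = self.encoder(self.embed(y))   # (batch, N_SUB, D_MODEL)
        logits = self.head(h.flatten(1))  # (batch, N_CAND)
        return logits.softmax(dim=-1)     # soft probabilities

detector = TransIMDetector()
y = torch.randn(16, N_SUB, 4)             # dummy received subblocks
probs = detector(y)                        # step 1: soft outputs
decisions = probs.argmax(dim=-1)           # step 2: hard decision per subblock

Step 2 here is a simple maximum-probability decision; the paper applies conventional signal detection methods to the network's soft outputs, which this argmax only stands in for.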
Pages: 1313-1317
Page count: 5