Always be Pre-Training: Representation Learning for Network Intrusion Detection with GNNs

Cited by: 1
Authors
Gu, Zhengyao [1 ]
Lopez, Diego Troy [2 ]
Alrahis, Lilas [3 ]
Sinanoglu, Ozgur [3 ]
Affiliations
[1] NYU, Ctr Data Sci, New York, NY 10012 USA
[2] NYU, Res Technol Serv, New York, NY USA
[3] New York Univ Abu Dhabi, Abu Dhabi, U Arab Emirates
Keywords
Intrusion detection; machine learning; graph neural network; NIDS; few-shot learning; self-supervised learning; INTERNET; THINGS; ATTACK; IOT
DOI
10.1109/ISQED60706.2024.10528371
CLC Number: TP3 [Computing Technology, Computer Technology]
Discipline Code: 0812
Abstract
Graph neural network-based network intrusion detection systems have recently demonstrated state-of-the-art performance on benchmark datasets. Nevertheless, these methods rely on target encoding for data pre-processing, which requires annotated labels, a cost-prohibitive requirement that limits widespread adoption. In this work, we propose a solution combining in-context pre-training with dense representations for categorical features to jointly overcome the label-dependency limitation. Our approach exhibits remarkable data efficiency, achieving over 98% of the performance of the supervised state-of-the-art with less than 4% labeled data on the NF-UQ-NIDS-V2 dataset.
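The abstract contrasts label-dependent target encoding with label-free dense representations for categorical flow features. A minimal NumPy sketch of that distinction (the feature values, dimensions, and variable names here are illustrative, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical categorical flow feature (e.g. L4 protocol number) for 8 flows.
protocols = np.array([6, 17, 6, 1, 6, 17, 1, 6])  # TCP / UDP / ICMP
labels = np.array([1, 0, 1, 0, 1, 0, 0, 1])       # attack / benign annotations

# Target encoding: replace each category with the mean label observed for
# that category. Note it cannot be computed without the `labels` array --
# this is the annotation dependency the paper argues against.
cats = np.unique(protocols)
target_enc = {c: labels[protocols == c].mean() for c in cats}
encoded = np.array([target_enc[p] for p in protocols])

# Dense representation: an embedding table indexed by category, initialised
# randomly here. No labels are touched; in practice the vectors would be
# trained by a self-supervised objective during pre-training.
emb_dim = 4
embedding_table = {c: rng.normal(size=emb_dim) for c in cats}
dense = np.stack([embedding_table[p] for p in protocols])
```

The point of the sketch is that `target_enc` is a function of the labels, while `embedding_table` depends only on the set of observed category values.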
Pages: 8