The Stacked Seq2seq-attention Model for Protocol Fuzzing

Cited by: 0
Authors
Gao, Zicong [1 ]
Dong, Weiyu [1 ]
Chang, Rui [1 ]
Ai, Chengwei [1 ]
Affiliations
[1] State Key Lab Math Engn & Adv Comp, Cyberspace Secur Dept, Zhengzhou, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Machine Learning; Security; Seq2seq Attention; Fuzzing;
DOI
10.1109/iccsnt47585.2019.8962499
CLC Number
TP301 [Theory and Methods];
Discipline Code
081202;
Abstract
Fuzzing is an effective approach to discovering vulnerabilities in software by feeding a program large amounts of unexpected data as input. Fuzzing a protocol automatically is difficult because generating test cases normally requires manually constructing a template that satisfies the protocol specification. In this paper, we build stacked seq2seq-attention models to generate protocol test cases automatically. Seq2seq-attention is a machine learning technique with an encoder-decoder structure that outputs text sequences based on context. We evaluate the training performance of seq2seq-attention models with different numbers of LSTM layers and show that a 3-layer LSTM achieves the highest test-case correctness. In addition, we implement a fuzzer based on the stacked seq2seq-attention model and compare it with a grammar-based fuzzer; the results indicate that the test cases generated by our fuzzer discover more unique basic blocks.
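As a rough illustration of the architecture the abstract describes, the PyTorch sketch below stacks a 3-layer LSTM encoder-decoder with dot-product (Luong-style) attention over byte-level protocol messages. It is a minimal sketch under stated assumptions, not the authors' implementation: the vocabulary size, dimensions, class names, and random training batch are illustrative placeholders.

```python
# Hypothetical sketch of a stacked seq2seq-attention model for generating
# protocol test cases. All names and hyperparameters are assumptions made
# for illustration, not taken from the paper.
import torch
import torch.nn as nn

VOCAB = 258          # 256 byte values + start/end markers (assumption)
EMBED, HIDDEN = 64, 128
NUM_LAYERS = 3       # the layer count the paper reports as most accurate

class Seq2SeqAttention(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, EMBED)
        self.encoder = nn.LSTM(EMBED, HIDDEN, NUM_LAYERS, batch_first=True)
        self.decoder = nn.LSTM(EMBED, HIDDEN, NUM_LAYERS, batch_first=True)
        self.out = nn.Linear(HIDDEN * 2, VOCAB)  # [decoder state; context]

    def forward(self, src, tgt):
        enc_out, state = self.encoder(self.embed(src))     # (B, S, H)
        dec_out, _ = self.decoder(self.embed(tgt), state)  # (B, T, H)
        # Dot-product attention: score every encoder step per decoder step.
        scores = torch.bmm(dec_out, enc_out.transpose(1, 2))  # (B, T, S)
        weights = torch.softmax(scores, dim=-1)
        context = torch.bmm(weights, enc_out)                 # (B, T, H)
        return self.out(torch.cat([dec_out, context], dim=-1))

# Usage: train on recorded protocol messages, then sample decoder
# outputs as fuzzing test cases. Random bytes stand in for a corpus here.
model = Seq2SeqAttention()
src = torch.randint(0, 256, (4, 32))
tgt = torch.randint(0, 256, (4, 32))
logits = model(src, tgt)               # (4, 32, VOCAB)
loss = nn.CrossEntropyLoss()(logits.reshape(-1, VOCAB), tgt.reshape(-1))
```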
Pages: 126 - 130
Page count: 5
Related Papers
50 records in total
  • [1] Short-term Electricity Price Forecasting with a Seq2Seq-Attention Network Based on Real-time Electricity Price Features
    Li Yucheng
    Li Changyun
    Journal of Hunan University of Technology, 2020, 34 (04) : 29 - 34
  • [2] A Text Summary Generation Method Based on an Improved Seq2Seq-Attention Model
    Men Ding
    Chen Liang
    Electronic Design Engineering, 2022, (23) : 6 - 10
  • [3] Lane Change Trajectory Prediction of Vehicles in Highway Interweaving Area Using Seq2Seq-attention Network
    Han H.
    Xie T.
    Zhongguo Gonglu Xuebao/China Journal of Highway and Transport, 2020, 33 (06) : 106 - 118
  • [4] Seq2Seq-AFL: Fuzzing via sequence-to-sequence model
    Yang, Liqun
    Wei, Chaoren
    Yang, Jian
    Ma, Jinxin
    Guo, Hongcheng
    Cheng, Long
    Li, Zhoujun
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2024, 15 (10) : 4403 - 4421
  • [5] Automatic Generation of Pseudocode with Attention Seq2seq Model
    Xu, Shaofeng
    Xiong, Yun
    2018 25TH ASIA-PACIFIC SOFTWARE ENGINEERING CONFERENCE (APSEC 2018), 2018, : 711 - 712
  • [6] A Hierarchical Attention Seq2seq Model with CopyNet for Text Summarization
    Zhang, Yong
    Wang, Yuheng
    Liao, Jinzhi
    Xiao, Weidong
    2018 INTERNATIONAL CONFERENCE ON ROBOTS & INTELLIGENT SYSTEM (ICRIS 2018), 2018, : 316 - 320
  • [7] Network Penetration Intrusion Prediction Based on Attention Seq2seq Model
    Yu, Tianxiang
    Xin, Yang
    Zhu, Hongliang
    Tang, Qifeng
    Chen, Yuling
    SECURITY AND COMMUNICATION NETWORKS, 2022, 2022
  • [8] A Hierarchical Attention Based Seq2Seq Model for Chinese Lyrics Generation
    Fan, Haoshen
    Wang, Jie
    Zhuang, Bojin
    Wang, Shaojun
    Xiao, Jing
    PRICAI 2019: TRENDS IN ARTIFICIAL INTELLIGENCE, PT III, 2019, 11672 : 279 - 288
  • [9] Evaluating Performance of Conversational Bot Using Seq2Seq Model and Attention Mechanism
    Saluja, Karandeep
    Agrawal, Shashwat
    Kumar, Sanjeev
    Choudhury, Tanupriya
    EAI ENDORSED TRANSACTIONS ON SCALABLE INFORMATION SYSTEMS, 2024, 11 (06) : 1 - 11
  • [10] Seq2Seq model with attention for predicting nonlinear propagation of ultrafast pulses in optical fibers
    Zeng, Yuanhang
    Zhu, Guangzhi
    Zhu, Xiao
    OPTICS AND LASER TECHNOLOGY, 2025, 181