Reachability Analysis of Deep ReLU Neural Networks using Facet-Vertex Incidence

Cited by: 9
Authors
Yang, Xiaodong [1 ]
Johnson, Taylor T. [1 ]
Hoang-Dung Tran [2 ]
Yamaguchi, Tomoya [3 ]
Hoxha, Bardh [3 ]
Prokhorov, Danil [3 ]
Affiliations
[1] Vanderbilt Univ, 221 Kirkland Hall, Nashville, TN 37235 USA
[2] Univ Nebraska, Lincoln, NE USA
[3] Toyota Res Inst, Ann Arbor, MI USA
DOI: 10.1145/3447928.3456650
Chinese Library Classification: TP [Automation Technology; Computer Technology]
Discipline Code: 0812
Abstract
Deep Neural Networks (DNNs) are powerful machine learning models for approximating complex functions. In this work, we present an exact reachability analysis method for DNNs with Rectified Linear Unit (ReLU) activation functions. At its core, our set-based method uses a facet-vertex incidence matrix, which is a complete encoding of the combinatorial structure of a convex set. When a safety violation is detected, our approach provides backtracking that determines the complete input set responsible for the violation. The performance of our method is evaluated and compared with other state-of-the-art methods on the ACAS Xu flight controller and other benchmarks.
Pages: 7