Mappings, dimensionality and reversing out of deep neural networks

Cited by: 2
Authors
Cui, Zhaofang
Grindrod, Peter
Institutions
Keywords
Degrees of freedom (mechanics); Embeddings; Multilayer neural networks
DOI
10.1093/imamat/hxad019
Chinese Library Classification (CLC)
O29 [Applied Mathematics]
Discipline code
070104
Abstract
We consider a large cloud of vectors formed at each layer of a standard neural network, corresponding to a large number of separate inputs presented independently to the classifier. Although the embedding dimension (the total possible degrees of freedom) reduces as we pass through successive layers, from input to output, the actual dimensionality of the point clouds that the layers contain does not necessarily reduce. We argue that this phenomenon may result in a vulnerability to (universal) adversarial attacks, which are small, specific perturbations. This analysis requires us to estimate the intrinsic dimension of point clouds (with values between 20 and 200) within embedding spaces of dimension 1000 up to 800,000, which requires some care. If the cloud dimension actually increases from one layer to the next, it implies some 'volume-filling' over-folding, and thus there exist small directional perturbations in the latter space that are equivalent to shifting large distances within the former space, inviting the possibility of universal and imperceptible attacks.
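The kind of intrinsic-dimension estimate the abstract refers to can, for illustration, be obtained with a nearest-neighbour ratio estimator such as TwoNN (Facco et al.); the paper does not state which estimator the authors used, so this is a minimal sketch of the general technique, not their method. It estimates the dimension of a point cloud from the ratio of each point's second- to first-nearest-neighbour distance, independently of the (possibly huge) embedding dimension:

```python
import numpy as np

def twonn_dimension(X):
    """Estimate the intrinsic dimension of a point cloud X (n_points, n_features)
    from the ratios mu = r2/r1 of second- to first-nearest-neighbour distances
    (maximum-likelihood form of the TwoNN estimator)."""
    X = np.asarray(X, dtype=float)
    # Squared pairwise distances via the dot-product identity,
    # avoiding an (n, n, d) intermediate array.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    np.maximum(d2, 0.0, out=d2)          # clip tiny negatives from round-off
    np.fill_diagonal(d2, np.inf)         # exclude each point as its own neighbour
    # Partial sort: columns 0 and 1 hold the two smallest squared distances.
    two_smallest = np.partition(d2, 1, axis=1)[:, :2]
    r1 = np.sqrt(two_smallest[:, 0])
    r2 = np.sqrt(two_smallest[:, 1])
    mu = r2 / r1
    # MLE for the Pareto exponent of mu: d = N / sum(log mu).
    return len(mu) / np.sum(np.log(mu))

# Sanity check: a 2-D cloud embedded linearly in 50 dimensions.
rng = np.random.default_rng(0)
Z = rng.normal(size=(1000, 2))           # intrinsic dimension 2
A = rng.normal(size=(2, 50))             # embedding into R^50
print(twonn_dimension(Z @ A))            # close to 2, not 50
```

For the embedding dimensions quoted in the abstract (up to 800,000), the same estimator applies unchanged, since only pairwise distances between points enter the computation; the embedding dimension affects cost, not the estimate itself.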
Pages: 2-11
Number of pages: 10
Related papers
50 records in total
  • [1] Mechanisms of dimensionality reduction and decorrelation in deep neural networks
    Huang, Haiping
    PHYSICAL REVIEW E, 2018, 98 (06)
  • [2] Robust dimensionality reduction for data visualization with deep neural networks
    Becker, Martin
    Lippel, Jens
    Stuhlsatz, Andre
    Zielke, Thomas
    GRAPHICAL MODELS, 2020, 108
  • [3] Estimation of a Function of Low Local Dimensionality by Deep Neural Networks
    Kohler, Michael
    Krzyzak, Adam
    Langer, Sophie
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2022, 68 (06) : 4032 - 4042
  • [4] Deep Convolutional Neural Networks with Merge-and-Run Mappings
    Zhao, Liming
    Li, Mingjie
    Meng, Depu
    Li, Xi
    Zhang, Zhaoxiang
    Zhuang, Yueting
    Tu, Zhuowen
    Wang, Jingdong
    PROCEEDINGS OF THE TWENTY-SEVENTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2018, : 3170 - 3176
  • [5] Dimensionality-Induced Information Loss of Outliers in Deep Neural Networks
    Uematsu, Kazuki
    Haruki, Kosuke
    Suzuki, Taiji
    Kimura, Mitsuhiro
    Takimoto, Takahiro
    Nakagawa, Hideyuki
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES: RESEARCH TRACK, PT I, ECML PKDD 2024, 2024, 14941 : 144 - 160
  • [6] On Detection of Out of Distribution Inputs in Deep Neural Networks
    Jha, Susmit
    Roy, Anirban
    2021 IEEE THIRD INTERNATIONAL CONFERENCE ON COGNITIVE MACHINE INTELLIGENCE (COGMI 2021), 2021, : 282 - 288
  • [7] Neural networks for dimensionality reduction
    Pal, NR
    Kumar, EV
    PROGRESS IN CONNECTIONIST-BASED INFORMATION SYSTEMS, VOLS 1 AND 2, 1998, : 221 - 224
  • [8] Deep ReLU neural networks overcome the curse of dimensionality for partial integrodifferential equations
    Gonon, Lukas
    Schwab, Christoph
    ANALYSIS AND APPLICATIONS, 2023, 21 (01) : 1 - 47
  • [9] Reducing the dimensionality of data with neural networks
    Hinton, G. E.
    Salakhutdinov, R. R.
    SCIENCE, 2006, 313 (5786) : 504 - 507
  • [10] Identity Mappings in Deep Residual Networks
    He, Kaiming
    Zhang, Xiangyu
    Ren, Shaoqing
    Sun, Jian
    COMPUTER VISION - ECCV 2016, PT IV, 2016, 9908 : 630 - 645