Mappings, dimensionality and reversing out of deep neural networks

Cited by: 2
Authors
Cui, Zhaofang
Grindrod, Peter
Affiliations
Keywords
Degrees of freedom (mechanics); Embeddings; Multilayer neural networks
DOI
10.1093/imamat/hxad019
Chinese Library Classification
O29 [Applied Mathematics]
Subject Classification Code
070104
Abstract
We consider a large cloud of vectors formed at each layer of a standard neural network, corresponding to a large number of separate inputs presented independently to the classifier. Although the embedding dimension (the total possible degrees of freedom) decreases as we pass through successive layers, from input to output, the actual dimensionality of the point clouds that the layers contain does not necessarily decrease. We argue that this phenomenon may create a vulnerability to (universal) adversarial attacks, which are small, specific perturbations. The analysis requires us to estimate the intrinsic dimension of point clouds (with values between 20 and 200) within embedding spaces of dimension from 1,000 up to 800,000, which needs some care. If the cloud dimension actually increases from one layer to the next, it implies some 'volume-filling' over-folding, and thus there exist small directional perturbations in the latter space that are equivalent to shifting large distances within the former space, inviting the possibility of universal and imperceptible attacks.
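The abstract notes that estimating the intrinsic dimension of a point cloud inside a much higher-dimensional embedding space "needs some care". The paper does not specify its estimator here; as an illustrative sketch only, the Two-NN estimator (which infers dimension from the ratio of each point's second- to first-nearest-neighbour distance) is one standard choice. The function name and synthetic data below are ours, not the authors'.

```python
import numpy as np

def two_nn_intrinsic_dimension(points: np.ndarray) -> float:
    """Two-NN maximum-likelihood estimate of intrinsic dimension:
    d ~= N / sum_i log(r2_i / r1_i), where r1_i, r2_i are the distances
    from point i to its first and second nearest neighbours."""
    n = len(points)
    # Squared pairwise distances via the Gram trick (avoids an
    # N x N x D broadcast, which is infeasible for large embedding D).
    sq = (points ** 2).sum(axis=1)
    d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * points @ points.T, 0.0)
    np.fill_diagonal(d2, np.inf)          # exclude self-distances
    two_smallest = np.partition(d2, 1, axis=1)[:, :2]
    r1 = np.sqrt(two_smallest[:, 0])
    r2 = np.sqrt(two_smallest[:, 1])
    return n / np.log(r2 / r1).sum()

# Synthetic check: a 2-D cloud embedded linearly in a 1000-D ambient
# space still has intrinsic dimension 2, far below its embedding dimension.
rng = np.random.default_rng(0)
latent = rng.normal(size=(500, 2))
cloud = latent @ rng.normal(size=(2, 1000))
print(two_nn_intrinsic_dimension(cloud))  # typically close to 2
```

Tracking this estimate layer by layer is what reveals the phenomenon the paper describes: the ambient (embedding) dimension shrinks monotonically, while the estimated cloud dimension need not.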
Pages: 2-11
Page count: 10