TAG: A Neural Network Model for Large-Scale Optical Implementation

Cited by: 1
|
Authors
Lee, Hyuek-Jae [1 ]
Lee, Soo-Young [1 ]
Shin, Sang-Yung [1 ]
Koh, Bo-Yun [2 ]
Affiliations
[1] Korea Adv Inst Sci & Technol, Dept Elect Engn, POB 150, Seoul, South Korea
[2] Agcy Def Dev, Daejon, South Korea
DOI
10.1162/neco.1991.3.1.135
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
TAG (Training by Adaptive Gain) is a new adaptive learning algorithm developed for optical implementation of large-scale artificial neural networks. For a fully interconnected single-layer neural network with N input and M output neurons, TAG contains two different types of interconnections: MN global fixed interconnections and N + M adaptive gain controls. For two-dimensional input patterns, the former may be realized by multifacet holograms and the latter by spatial light modulators (SLMs). For the same number of input and output neurons, TAG requires far fewer adaptive elements than the perceptron, making large-scale optical implementation feasible at some sacrifice in performance. The training algorithm is based on gradient descent with error backpropagation and extends readily to multilayer architectures. Computer simulations demonstrate reasonable performance of TAG compared to the perceptron. An electrooptical implementation of TAG is also proposed.
Pages: 135-143 (9 pages)
Related Papers (50 total)
  • [31] Hou, Zengguang; Wu, Cangpu. Dynamic programming neural network for large-scale optimization problems. Zidonghua Xuebao/Acta Automatica Sinica, 1999, 25(01): 45-51
  • [32] Bueno, J.; Maktoobi, S.; Froehly, L.; Fischer, I.; Jacquot, M.; Larger, L.; Brunner, D. Reinforcement learning in a large-scale photonic recurrent neural network. Optica, 2018, 5(06): 756-760
  • [33] Song, Xiran; Huang, Hong; Lian, Jianxun; Jin, Hai. XGCN: a library for large-scale graph neural network recommendations. Frontiers of Computer Science, 2024, 18
  • [35] Carrino, John A.; Unkel, Paul J.; Miller, Ira D.; Bowser, Cindy L.; Freckleton, Michael W.; Johnson, Thomas G. Large-scale PACS implementation. Journal of Digital Imaging, 1998, 11: 3-7
  • [36] Xie, Chaochen; Guo, Rujing; Zhao, Jianzhou. Technique for Large-Scale Antenna Beamforming Based on Neural Network. Wireless Communications & Mobile Computing, 2022
  • [37] Kepner, Jeremy; Cho, Kenjiro; Claffy, K. C.; Gadepally, Vijay; Michaleas, Peter; Milechin, Lauren. Hypersparse Neural Network Analysis of Large-Scale Internet Traffic. 2019 IEEE High Performance Extreme Computing Conference (HPEC), 2019
  • [38] Jump, L. B.; Ligomenides, P. A. A Modular Ring Architecture for Large-Scale Neural Network Implementations. Visual Communications and Image Processing IV, Pts 1-3, 1989, 1199: 1127-1136
  • [39] Qi, Quanchang; Lu, Guangming; Zhang, Jun; Yang, Lichun; Liu, Haishan. Parallel Large-Scale Neural Network Training for Online Advertising. 2018 IEEE International Conference on Big Data (Big Data), 2018: 343-350
  • [40] Hou, Z. G.; Wu, C. P.; Bao, P. A neural network for hierarchical optimization of nonlinear large-scale systems. International Journal of Systems Science, 1998, 29(02): 159-166