Contrastive Learning and Neural Oscillations

Cited by: 28
Authors:
Baldi, Pierre [1 ,2 ]
Pineda, Fernando [3 ,4 ]
Affiliations:
[1] CALTECH, Jet Prop Lab, 4800 Oak Grove Dr, Pasadena, CA 91125 USA
[2] CALTECH, Div Biol, Pasadena, CA 91125 USA
[3] Johns Hopkins Univ, Appl Phys Lab, Baltimore, MD 21218 USA
[4] Johns Hopkins Univ, Dept Elect & Comp Engn, Baltimore, MD 21218 USA
DOI: 10.1162/neco.1991.3.4.526
CLC Classification: TP18 [Artificial Intelligence Theory]
Discipline Codes: 081104; 0812; 0835; 1405
Abstract
The concept of Contrastive Learning (CL) is developed as a family of possible learning algorithms for neural networks. CL is an extension of Deterministic Boltzmann Machines to more general dynamical systems. During learning, the network oscillates between two phases: one with a teacher signal and one without. The weights are updated by a learning rule that corresponds to gradient descent on a contrast function measuring the discrepancy between the free network and the network with a teacher signal. The CL approach provides a general unified framework for developing new learning algorithms, and it shows that many different types of clamping and teacher signals are possible. Several examples are given, and an analysis of the landscape of the contrast function is proposed, with some relevant predictions for the CL curves. An approach that may be suitable for collective analog implementations is described. Simulation results and possible extensions are briefly discussed, together with a new conjecture regarding the function of certain oscillations in the brain. In the appendix, we also examine two extensions of contrastive learning to time-dependent trajectories.
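The two-phase scheme the abstract describes can be sketched as a small contrastive Hebbian learning example: settle the network once with the output free, once with the output clamped to the target, and update the weights along the difference of co-activations. The network layout, function names, and update details below are illustrative assumptions for the deterministic case, not the paper's own formulation.

```python
import numpy as np

def settle(W, s, free, steps=60):
    """Relax the listed free units to a fixed point under symmetric weights W."""
    s = s.copy()
    for _ in range(steps):
        s[free] = np.tanh(W[free] @ s)   # deterministic (mean-field) dynamics
    return s

def chl_step(W, x, t, lr=0.2):
    """One contrastive update: a free phase with no teacher signal, a
    clamped phase with the output held at the target t, then a weight
    change proportional to the difference of co-activation statistics."""
    n = W.shape[0]
    s = np.zeros(n)
    s[: len(x)] = x                       # clamp the input units
    s_free = settle(W, s, free=[n - 1])   # phase 1: output runs free
    s[n - 1] = t                          # phase 2: teacher clamps the output
    s_teach = settle(W, s, free=[])       # nothing left to relax in this tiny net
    dW = lr * (np.outer(s_teach, s_teach) - np.outer(s_free, s_free))
    np.fill_diagonal(dW, 0.0)             # no self-connections
    return W + dW                         # dW is symmetric, so W stays symmetric
```

Repeating `chl_step` over training examples drives the free-phase output toward the teacher-clamped output, which in this deterministic setting amounts to descending the contrast function between the two phases.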
Pages: 526-545 (20 pages)
Related Papers (50 items total)
  • [41] Dong, Haozhen; Ren, Hongmin; Shi, Jialiang; Xie, Yichen; Hu, Xudong. Neighborhood contrastive learning-based graph neural network for bug triaging. SCIENCE OF COMPUTER PROGRAMMING, 2024, 235.
  • [42] Liu, Yunwu; Zhang, Ruisheng; Li, Tongfeng; Jiang, Jing; Ma, Jun; Yuan, Yongna; Wang, Ping. Molecular representation contrastive learning via transformer embedding to graph neural networks. APPLIED SOFT COMPUTING, 2024, 164.
  • [43] Yang, Zonghan; Cheng, Yong; Liu, Yang; Sun, Maosong. Reducing Word Omission Errors in Neural Machine Translation: A Contrastive Learning Approach. 57TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2019), 2019: 6191-6196.
  • [44] Amano, Kaoru; VanRullen, Rufin. Neural Oscillations and Behavioral Oscillations. I-PERCEPTION, 2019, 10: 25-26.
  • [45] Yeh, Chun-Hsiao; Hong, Cheng-Yao; Hsu, Yen-Chi; Liu, Tyng-Luh; Chen, Yubei; LeCun, Yann. Decoupled Contrastive Learning. COMPUTER VISION, ECCV 2022, PT XXVI, 2022, 13686: 668-684.
  • [46] Koishekenov, Yeskendir; Vadgama, Sharvaree; Valperga, Riccardo; Bekkers, Erik J. Geometric Contrastive Learning. 2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION WORKSHOPS, ICCVW, 2023: 206-215.
  • [47] Khosla, Prannay; Teterwak, Piotr; Wang, Chen; Sarna, Aaron; Tian, Yonglong; Isola, Phillip; Maschinot, Aaron; Liu, Ce; Krishnan, Dilip. Supervised Contrastive Learning. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33.
  • [48] Miller, Benjamin Kurt; Weniger, Christoph; Forre, Patrick. Contrastive Neural Ratio Estimation. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022.
  • [49] Prabhushankar, Mohit; Kwon, Gukyeong; Temel, Dogancan; AlRegib, Ghassan. CONTRASTIVE EXPLANATIONS IN NEURAL NETWORKS. 2020 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2020: 3289-3293.
  • [50] Hesslow, Daniel; Poli, Iacopo. Contrastive embeddings for neural architectures. arXiv, 2021.