Contrastive Learning and Neural Oscillations

Cited by: 28
Authors:
Baldi, Pierre [1 ,2 ]
Pineda, Fernando [3 ,4 ]
Affiliations:
[1] CALTECH, Jet Prop Lab, 4800 Oak Grove Dr, Pasadena, CA 91125 USA
[2] CALTECH, Div Biol, Pasadena, CA 91125 USA
[3] Johns Hopkins Univ, Appl Phys Lab, Baltimore, MD 21218 USA
[4] Johns Hopkins Univ, Dept Elect & Comp Engn, Baltimore, MD 21218 USA
Keywords:
DOI:
10.1162/neco.1991.3.4.526
CLC Number:
TP18 [Artificial Intelligence Theory];
Subject Classification Codes:
081104; 0812; 0835; 1405
Abstract:
The concept of Contrastive Learning (CL) is developed as a family of possible learning algorithms for neural networks. CL is an extension of Deterministic Boltzmann Machines to more general dynamical systems. During learning, the network oscillates between two phases: one with a teacher signal and one without. The weights are updated by a learning rule that corresponds to gradient descent on a contrast function measuring the discrepancy between the free network and the network with a teacher signal. The CL approach provides a general unified framework for developing new learning algorithms and shows that many different types of clamping and teacher signals are possible. Several examples are given, and an analysis of the landscape of the contrast function is proposed, with some relevant predictions for the CL curves. An approach that may be suitable for collective analog implementations is described. Simulation results and possible extensions are briefly discussed, together with a new conjecture regarding the function of certain oscillations in the brain. In the appendix, we also examine two extensions of contrastive learning to time-dependent trajectories.
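The two-phase update described in the abstract can be sketched concretely. Below is a minimal, illustrative Python/NumPy sketch of a contrastive Hebbian-style rule for a small symmetric network: a free phase with only the inputs clamped, a teacher phase with the output also clamped, and a weight change driven by the discrepancy between the co-activity statistics of the two settled states. The network size, sigmoid settling dynamics, learning rate, and the specific contrast used here (a difference of outer products) are assumptions for illustration, not the paper's exact formulation.

```python
# Minimal sketch of a two-phase contrastive (Hebbian-style) update in the
# spirit of the abstract. All numerical choices below are illustrative
# assumptions, not the authors' exact formulation.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def settle(W, init_state, clamped, n_steps=50):
    """Relax the unclamped units of a symmetric network toward a fixed point."""
    s = init_state.copy()
    for _ in range(n_steps):
        s = np.where(clamped, s, sigmoid(W @ s))  # clamped units keep their values
    return s

n_units = 6                              # units 0-2: inputs, 3-4: hidden, 5: output (assumed)
W = rng.normal(scale=0.1, size=(n_units, n_units))
W = 0.5 * (W + W.T)                      # symmetric weights, Boltzmann-machine style
np.fill_diagonal(W, 0.0)

x = np.array([1.0, 0.0, 1.0])            # example input pattern (assumed)
t = 1.0                                  # teacher signal for the output unit (assumed)
eta = 0.1                                # learning rate (assumed)

state = np.zeros(n_units)
state[:3] = x
free_clamp = np.array([True, True, True, False, False, False])   # inputs only
teach_clamp = np.array([True, True, True, False, False, True])   # inputs + output
state_teach = state.copy()
state_teach[5] = t

for _ in range(200):
    s_free = settle(W, state, free_clamp)          # phase without the teacher signal
    s_teach = settle(W, state_teach, teach_clamp)  # phase with the teacher signal
    # Contrastive step: move the free-phase co-activity statistics toward the
    # teacher-phase statistics (a gradient-descent-like step on a contrast function).
    dW = eta * (np.outer(s_teach, s_teach) - np.outer(s_free, s_free))
    np.fill_diagonal(dW, 0.0)
    W += dW                                        # dW is symmetric, so W stays symmetric

print("free-phase output after training:", settle(W, state, free_clamp)[5])
```

In this sketch the "contrast" is simply the difference of co-activity statistics between the two phases; the paper develops a more general family of contrast functions, clamping schemes, and teacher signals beyond this special case.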
Pages: 526-545
Number of pages: 20
Related Papers (50 in total)
  • [31] Oscillations, neural computations and learning during wake and sleep
    Penagos, Hector
    Varela, Carmen
    Wilson, Matthew A.
    CURRENT OPINION IN NEUROBIOLOGY, 2017, 44 : 193 - 201
  • [32] Contrastive Dual Gating: Learning Sparse Features With Contrastive Learning
    Meng, Jian
    Yang, Li
    Shin, Jinwoo
    Fan, Deliang
    Seo, Jae-Sun
    2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2022, : 12247 - 12255
  • [33] Disentangled Relational Graph Neural Network with Contrastive Learning for knowledge graph completion
    Yin, Hong
    Zhong, Jiang
    Li, Rongzhen
    Li, Xue
    KNOWLEDGE-BASED SYSTEMS, 2024, 295
  • [34] Contrastive meta-reinforcement learning for heterogeneous graph neural architecture search
    Xu, Zixuan
    Wu, Jia
    EXPERT SYSTEMS WITH APPLICATIONS, 2025, 260
  • [35] Hyperbolic graph convolutional neural network with contrastive learning for automated ICD coding
    Wu, Yuzhou
    Chen, Xuechen
    Yao, Xin
    Yu, Yongang
    Chen, Zhigang
    COMPUTERS IN BIOLOGY AND MEDICINE, 2024, 168
  • [36] Multi-channel Graph Neural Networks with Contrastive Learning for Social Recommendation
    Liu, Ping
    Yang, Jian
    2023 IEEE INTERNATIONAL CONFERENCE ON WEB INTELLIGENCE AND INTELLIGENT AGENT TECHNOLOGY, WI-IAT, 2023, : 32 - 39
  • [37] Globally Enhanced Heterogeneous Temporal Graph Neural Networks Based on Contrastive Learning
    Jiao P.
    Liu H.
    Lü L.
    Gao M.
    Zhang J.
    Liu D.
    Jisuanji Yanjiu yu Fazhan/Computer Research and Development, 2023, 60 (08): 1808 - 1821
  • [38] Contrastive learning of protein representations with graph neural networks for structural and functional annotations
    Luo, Jiaqi
    Luo, Yunan
    BIOCOMPUTING 2023, PSB 2023, 2023, : 109 - 120
  • [39] Task-Adaptive Neural Network Search with Meta-Contrastive Learning
    Jeong, Wonyong
    Lee, Hayeon
    Park, Geon
    Hyung, Eunyoung
    Baek, Jinheon
    Hwang, Sung Ju
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021,
  • [40] CSGNN: Improving Graph Neural Networks with Contrastive Semi-supervised Learning
    Song, Yumeng
    Gu, Yu
    Li, Xiaohua
    Li, Chuanwen
    Yu, Ge
    DATABASE SYSTEMS FOR ADVANCED APPLICATIONS, DASFAA 2022, PT I, 2022, : 731 - 738