Hard Negative Sampling via Regularized Optimal Transport for Contrastive Representation Learning

Cited by: 1
Authors
Jiang, Ruijie [1 ]
Ishwar, Prakash [2 ]
Aeron, Shuchin [1 ]
Affiliations
[1] Tufts Univ, Dept ECE, Medford, MA 02155 USA
[2] Boston Univ, Dept ECE, Boston, MA USA
Keywords
contrastive representation learning; hard negative sampling; optimal transport (OT);
DOI
10.1109/IJCNN54540.2023.10191650
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
We study the problem of designing hard negative sampling distributions for unsupervised contrastive representation learning. We propose and analyze a novel min-max framework that seeks a representation which minimizes the maximum (worst-case) generalized contrastive learning loss over all couplings (joint distributions between positive and negative samples subject to marginal constraints) and prove that the resulting min-max optimal representation is degenerate. This provides the first theoretical justification for incorporating additional regularization constraints on the couplings. We re-interpret the min-max problem through the lens of Optimal Transport (OT) theory and utilize regularized transport couplings to control the degree of hardness of the negative examples. Through experiments we demonstrate that the negative samples generated from our designed negative distribution are more similar to the anchor than those generated from the baseline negative distribution. We also demonstrate that entropic regularization yields negative sampling distributions with a parametric form similar to that of a recent state-of-the-art negative sampling design, with comparable performance on multiple datasets. Utilizing the uncovered connection with OT, we propose a new ground cost for designing the negative distribution and show improved performance of the learned representation on downstream tasks compared to the representation learned using a squared Euclidean cost.
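The entropic-regularization idea above admits a compact sketch: an entropic OT coupling between a batch of anchor embeddings and a batch of candidate negatives concentrates mass on low-cost (i.e., hard) pairs, and its rows can serve as per-anchor negative-sampling weights. The sketch below is an illustration under stated assumptions, not the authors' implementation: the function name `sinkhorn_plan`, the uniform marginals, and the values of `eps` and `n_iters` are hypothetical choices; only the squared Euclidean ground cost is taken from the abstract.

```python
import math
import torch

def sinkhorn_plan(cost: torch.Tensor, eps: float = 0.1, n_iters: int = 100) -> torch.Tensor:
    """Entropic-regularized OT coupling between uniform marginals.

    cost: (B, M) ground-cost matrix between B anchors and M candidate
    negatives. Returns a (B, M) coupling pi; a larger pi[i, j] marks
    negative j as a harder example for anchor i.
    """
    B, M = cost.shape
    log_a = torch.full((B, 1), -math.log(B), dtype=cost.dtype)  # uniform anchor marginal
    log_b = torch.full((1, M), -math.log(M), dtype=cost.dtype)  # uniform negative marginal
    f = torch.zeros(B, 1, dtype=cost.dtype)  # dual potentials
    g = torch.zeros(1, M, dtype=cost.dtype)
    for _ in range(n_iters):  # log-domain Sinkhorn updates (numerically stable)
        f = eps * (log_a - torch.logsumexp((g - cost) / eps, dim=1, keepdim=True))
        g = eps * (log_b - torch.logsumexp((f - cost) / eps, dim=0, keepdim=True))
    return torch.exp((f + g - cost) / eps)

# Usage: with L2-normalized embeddings, a squared Euclidean ground cost makes
# nearby (hard) negatives cheap to couple, so they receive more transport mass.
anchors = torch.nn.functional.normalize(torch.randn(32, 128), dim=1)
negatives = torch.nn.functional.normalize(torch.randn(256, 128), dim=1)
cost = torch.cdist(anchors, negatives) ** 2
pi = sinkhorn_plan(cost, eps=0.05)
weights = pi / pi.sum(dim=1, keepdim=True)  # per-anchor negative-sampling weights
```

Shrinking `eps` sharpens each row of the coupling toward the hardest negatives, mirroring how the regularization strength controls the degree of hardness in the framework described above.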
Pages: 8
Related Papers
50 items total
  • [1] Event representation via contrastive learning with prototype based hard negative sampling
    Kong, Jing
    Yang, Zhouwang
    NEUROCOMPUTING, 2024, 600
  • [2] Contrastive Speaker Representation Learning with Hard Negative Sampling for Speaker Recognition
    Go, Changhwan
    Lee, Young Han
    Kim, Taewoo
    Park, Nam In
    Chun, Chanjun
    SENSORS, 2024, 24 (19)
  • [3] Representation Learning via Adversarially-Contrastive Optimal Transport
    Cherian, Anoop
    Aeron, Shuchin
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 119, 2020, 119
  • [4] Hard Negative Sample Mining for Contrastive Representation in Reinforcement Learning
    Chen, Qihang
    Liang, Dayang
    Liu, Yunlong
    ADVANCES IN KNOWLEDGE DISCOVERY AND DATA MINING, PAKDD 2022, PT II, 2022, 13281 : 277 - 288
  • [5] Hard Negative Mixing for Contrastive Learning
    Kalantidis, Yannis
    Sariyildiz, Mert Bulent
    Pion, Noe
    Weinzaepfel, Philippe
    Larlus, Diane
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [6] EMCRL: EM-Enhanced Negative Sampling Strategy for Contrastive Representation Learning
    Zhang, Kun
    Lv, Guangyi
    Wu, Le
    Hong, Richang
    Wang, Meng
    IEEE TRANSACTIONS ON COMPUTATIONAL SOCIAL SYSTEMS, 2024
  • [7] ReCoRe: Regularized Contrastive Representation Learning of World Model
    Poudel, Rudra P. K.
    Pandya, Harit
    Liwicki, Stephan
    Cipolla, Roberto
    2024 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2024, : 22904 - 22913
  • [8] Synthetic Hard Negative Samples for Contrastive Learning
    Dong, Hengkui
    Long, Xianzhong
    Li, Yun
    NEURAL PROCESSING LETTERS, 2024, 56 (01)