PolyCL: contrastive learning for polymer representation learning via explicit and implicit augmentations

Cited: 0
Authors
Zhou, Jiajun [1 ]
Yang, Yijie [1 ]
Mroz, Austin M. [1 ,2 ]
Jelfs, Kim E. [1 ]
Affiliations
[1] Imperial Coll London, Dept Chem, Mol Sci Res Hub, White City Campus,Wood Lane, London W12 0BZ, England
[2] Imperial Coll London, Ctr AI Sci 1 10, White City Campus,Wood Lane, London W12 0BZ, England
Source
DIGITAL DISCOVERY | 2025, Vol. 4, Issue 1
Funding
UK Engineering and Physical Sciences Research Council; European Research Council; EU Horizon 2020
Keywords
DOI
10.1039/d4dd00236a
Chinese Library Classification
O6 [Chemistry]
Discipline Code
0703
Abstract
Polymers play a crucial role in a wide array of applications due to their diverse and tunable properties. Establishing the relationship between polymer representations and their properties is crucial to the computational design and screening of candidate polymers via machine learning, and the quality of the representation significantly influences the effectiveness of these computational methods. Here, we present a self-supervised contrastive learning paradigm, PolyCL, for learning robust, high-quality polymer representations without the need for labels. Our model combines explicit and implicit augmentation strategies for improved learning performance. The results demonstrate that, used as a feature extractor, our model achieves better or highly competitive performance on transfer learning tasks without an overcomplicated training strategy or hyperparameter optimisation. To further enhance the efficacy of our model, we conducted extensive analyses of the augmentation combinations used in contrastive learning, identifying the most effective combination for maximising PolyCL's performance.
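The contrastive paradigm the abstract describes can be illustrated with the standard NT-Xent (InfoNCE-style) objective commonly used in such frameworks. The sketch below is illustrative only and is not taken from the PolyCL paper: the function name, shapes, and temperature value are assumptions, and the two input matrices stand in for embeddings of two augmented views (e.g. one explicit and one implicit augmentation) of the same batch of polymers.

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.1):
    """NT-Xent contrastive loss (illustrative sketch, not PolyCL's exact code).

    z1, z2: (N, d) embeddings of two augmented views of the same N polymers.
    Positive pairs are (z1[i], z2[i]); every other embedding in the batch
    serves as a negative.
    """
    # Cosine similarity: L2-normalise, then take dot products.
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    z = np.concatenate([z1, z2], axis=0)          # (2N, d)
    sim = (z @ z.T) / temperature                 # (2N, 2N) scaled similarities
    np.fill_diagonal(sim, -np.inf)                # a sample is never its own pair
    n = len(z1)
    # The positive for row i is row i+N, and vice versa.
    targets = np.concatenate([np.arange(n) + n, np.arange(n)])
    # Numerically stable row-wise softmax cross-entropy.
    logits = sim - sim.max(axis=1, keepdims=True)
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(2 * n), targets].mean()
```

When the two views embed to identical vectors the loss is near zero, and it grows as the views of the same polymer drift apart relative to the in-batch negatives; the augmentation combinations the paper analyses determine how those views are generated.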
Pages: 149-160
Page count: 12