Siamese Recurrent Neural Network with a Self-Attention Mechanism for Bioactivity Prediction

Cited by: 21
Authors
Fernandez-Llaneza, Daniel [1 ]
Ulander, Silas [1 ]
Gogishvili, Dea [1 ]
Nittinger, Eva [1 ]
Zhao, Hongtao [1 ]
Tyrchan, Christian [1 ]
Affiliations
[1] AstraZeneca, Dept Med Chem Res & Early Dev, Resp & Immunol, Biopharmaceut R&D, SE-43183 Molndal, Sweden
Source
ACS OMEGA, 2021, Vol. 6, No. 16
Keywords
DRUG DISCOVERY
DOI
10.1021/acsomega.1c01266
Chinese Library Classification (CLC) Number
O6 [Chemistry]
Subject Classification Code
0703
Abstract
Activity prediction plays an essential role in drug discovery by directing the search for drug candidates in the relevant chemical space. Despite being applied successfully to image recognition and semantic similarity, the Siamese neural network has rarely been explored in drug discovery, where modelling faces challenges such as insufficient data and class imbalance. Here, we present a Siamese recurrent neural network model (SiameseCHEM) based on a bidirectional long short-term memory architecture with a self-attention mechanism, which can automatically learn discriminative features from the SMILES representations of small molecules. The model is then used to categorize the bioactivity of small molecules via N-shot learning. Trained on random SMILES strings, it proves robust across five different datasets for the task of binary or categorical classification of bioactivity. When benchmarked against two baseline machine learning models that use chemistry-rich ECFP fingerprints as input, the deep learning model outperforms them on three datasets and achieves comparable performance on the other two. The failure of both baseline methods on SMILES strings highlights that the deep learning model may learn task-specific chemistry features encoded in SMILES strings.
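To make the described architecture concrete, the following is a minimal sketch (not the authors' published code) of how a Siamese bidirectional LSTM encoder with additive self-attention over SMILES tokens might be wired in PyTorch. The layer sizes, the character-level SMILES vocabulary, and the exponentiated-L1 similarity head are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch of a SiameseCHEM-style model: shared BiLSTM encoder with
# self-attention pooling over SMILES tokens, compared via a similarity head.
import torch
import torch.nn as nn


class SmilesEncoder(nn.Module):
    def __init__(self, vocab_size, embed_dim=64, hidden_dim=128, attn_dim=64):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.bilstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        # Additive self-attention: score each time step, softmax, then pool.
        self.attn = nn.Sequential(
            nn.Linear(2 * hidden_dim, attn_dim),
            nn.Tanh(),
            nn.Linear(attn_dim, 1),
        )

    def forward(self, tokens):                        # tokens: (batch, seq_len)
        h, _ = self.bilstm(self.embedding(tokens))    # (batch, seq_len, 2*hidden)
        weights = torch.softmax(self.attn(h), dim=1)  # (batch, seq_len, 1)
        return (weights * h).sum(dim=1)               # attention-pooled embedding


class SiameseCHEM(nn.Module):
    """Twin encoders with shared weights; outputs a similarity score in (0, 1]."""

    def __init__(self, vocab_size):
        super().__init__()
        self.encoder = SmilesEncoder(vocab_size)

    def forward(self, tokens_a, tokens_b):
        za, zb = self.encoder(tokens_a), self.encoder(tokens_b)
        # Exponentiated negative L1 distance, a common Siamese similarity head
        # (an illustrative choice here, not necessarily the paper's).
        return torch.exp(-torch.norm(za - zb, p=1, dim=1))


# Usage: compare two padded, integer-encoded SMILES strings.
model = SiameseCHEM(vocab_size=40)
a = torch.randint(1, 40, (2, 60))   # batch of 2 dummy token sequences
b = torch.randint(1, 40, (2, 60))
print(model(a, b).shape)            # torch.Size([2])
```

In an N-shot setting, a query molecule would be paired with a handful of labeled support molecules per class and assigned to the class whose support set yields the highest average similarity.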
Pages: 11086-11094
Number of pages: 9
Related Papers
50 records in total
  • [41] Dynamic Structured Neural Topic Model with Self-Attention Mechanism
    Miyamoto, Nozomu
    Isonuma, Masaru
    Takase, Sho
    Mori, Junichiro
    Sakata, Ichiro
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL 2023, 2023, : 5916 - 5930
  • [43] A prediction model of student performance based on self-attention mechanism
    Chen, Yan
    Wei, Ganglin
    Liu, Jiaxin
    Chen, Yunwei
    Zheng, Qinghua
    Tian, Feng
    Zhu, Haiping
    Wang, Qianying
    Wu, Yaqiang
    KNOWLEDGE AND INFORMATION SYSTEMS, 2023, 65 (02) : 733 - 758
  • [44] Incorporating edge convolution and correlative self-attention into graph neural network for material properties prediction
    Yang, Zexi
    Yu, Qi
    Zhan, Yapeng
    Liu, Jiying
MACHINE LEARNING-SCIENCE AND TECHNOLOGY, 2025, 6 (01)
  • [45] Prediction of remaining useful life based on bidirectional gated recurrent unit with temporal self-attention mechanism
    Zhang, Jiusi
    Jiang, Yuchen
    Wu, Shimeng
    Li, Xiang
    Luo, Hao
    Yin, Shen
    RELIABILITY ENGINEERING & SYSTEM SAFETY, 2022, 221
  • [46] Original Music Generation using Recurrent Neural Networks with Self-Attention
    Jagannathan, Akash
    Chandrasekaran, Bharathi
    Dutta, Shubham
    Patil, Uma Rameshgouda
    Eirinaki, Magdalini
    2022 FOURTH IEEE INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE TESTING (AITEST 2022), 2022, : 56 - 63
  • [47] Neural network based on convolution and self-attention fusion mechanism for plant leaves disease recognition
    Zhao, Yun
    Li, Yang
    Wu, Na
    Xu, Xing
    CROP PROTECTION, 2024, 180
  • [48] Predicting the Feasibility of Copper(I)-Catalyzed Alkyne-Azide Cycloaddition Reactions Using a Recurrent Neural Network with a Self-Attention Mechanism
    Su, Shimin
    Yang, Yuyao
    Gan, Hanlin
    Zheng, Shuangjia
    Gu, Fenglong
    Zhao, Chao
    Xu, Jun
    JOURNAL OF CHEMICAL INFORMATION AND MODELING, 2020, 60 (03) : 1165 - 1174
  • [49] Image Classification based on Self-attention Convolutional Neural Network
    Cai, Xiaohong
    Li, Ming
    Cao, Hui
    Ma, Jingang
    Wang, Xiaoyan
    Zhuang, Xuqiang
    SIXTH INTERNATIONAL WORKSHOP ON PATTERN RECOGNITION, 2021, 11913
  • [50] Using recurrent neural network structure with Enhanced Multi-Head Self-Attention for sentiment analysis
    Xue-Liang Leng
    Xiao-Ai Miao
    Tao Liu
    Multimedia Tools and Applications, 2021, 80 : 12581 - 12600