hERG-Att: Self-attention-based deep neural network for predicting hERG blockers

Cited: 31
Authors
Kim, Hyunho [1 ]
Nam, Hojung [1 ]
Affiliations
[1] Gwangju Inst Sci & Technol GIST, Sch Elect Engn & Comp Sci, Gwangju 61005, South Korea
Funding
National Research Foundation of Singapore;
Keywords
hERG blockers prediction; Deep learning; Self-attention mechanism; POTASSIUM CHANNELS; DRUGS; CLASSIFICATION; PROLONGATION; POINTES;
DOI
10.1016/j.compbiolchem.2020.107286
Chinese Library Classification (CLC)
Q [Biological Sciences];
Subject Classification Codes
07; 0710; 09;
Abstract
A voltage-gated potassium channel encoded by the human ether-à-go-go-related gene (hERG) regulates the cardiac action potential, and compounds that inhibit its activity are associated with cardiotoxicity. Therefore, screening for hERG channel blockers is a mandatory step in the drug discovery process. Screening for hERG blockers with conventional experimental methods is inefficient in terms of cost and effort, which has led to the development of many in silico hERG blocker prediction models. However, constructing a predictive model that is both high-performing and interpretable with respect to how particular compounds block hERG remains a major obstacle. In this study, we developed the first attention-based, interpretable model that predicts hERG blockers and captures important hERG-related compound substructures. To do so, we first collected various datasets, ranging from public databases to publicly available private datasets, to train and test the model. We then built a precise and interpretable hERG blocker prediction model by using deep learning with a self-attention mechanism applied to an appropriate molecular descriptor, the Morgan fingerprint. The proposed model was validated, and the results showed that it was well optimized and achieved high performance. Its test-set performance was significantly higher than that of previous fingerprint-based conventional machine learning models; in particular, the proposed model achieved high accuracy and F1 scores, reflecting its predictive reliability. Furthermore, we interpreted the attention score vectors computed by the model and identified important structural patterns represented in hERG blockers. In summary, we propose a powerful and interpretable hERG blocker prediction model that can reduce the overall cost of drug discovery by accurately screening for hERG blockers and suggesting hERG-related substructures.
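To make the approach described in the abstract concrete, the following is a minimal illustrative sketch, not the authors' released hERG-Att code: it featurizes a molecule as a 2048-bit Morgan fingerprint (radius 2) with RDKit and applies self-attention over the fingerprint bits with PyTorch, so that per-bit attention weights can be inspected much like the attention score vectors described above. The class name AttentionFingerprintNet, the SMILES example, and all hyperparameters are illustrative assumptions, not values from the paper.

    # Minimal sketch (not the authors' released code): self-attention over Morgan
    # fingerprint bits for binary hERG-blocker classification.
    # Assumptions: RDKit and PyTorch installed; architecture and sizes are illustrative.
    import torch
    import torch.nn as nn
    from rdkit import Chem
    from rdkit.Chem import AllChem

    N_BITS = 2048  # Morgan fingerprint length (radius 2), a common choice for fingerprint models

    def morgan_bits(smiles: str) -> torch.Tensor:
        """Return a 0/1 float tensor of Morgan fingerprint bits for one SMILES string."""
        mol = Chem.MolFromSmiles(smiles)
        fp = AllChem.GetMorganFingerprintAsBitVect(mol, radius=2, nBits=N_BITS)
        return torch.tensor(list(fp), dtype=torch.float32)

    class AttentionFingerprintNet(nn.Module):
        """Embed each fingerprint bit, let bits attend to each other, then classify.
        Each Morgan bit corresponds to a substructure, so the returned attention map
        can be used to highlight substructures, loosely mirroring the paper's idea."""
        def __init__(self, n_bits=N_BITS, d_model=64, n_heads=4):
            super().__init__()
            self.bit_embed = nn.Embedding(n_bits, d_model)      # one vector per bit position
            self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
            self.classifier = nn.Sequential(
                nn.Linear(d_model, 32), nn.ReLU(), nn.Linear(32, 1)
            )

        def forward(self, bits):                                  # bits: (batch, n_bits) in {0,1}
            idx = torch.arange(bits.size(1), device=bits.device)  # bit positions 0..n_bits-1
            x = self.bit_embed(idx).unsqueeze(0) * bits.unsqueeze(-1)   # zero out absent bits
            h, attn_w = self.attn(x, x, x, need_weights=True)     # self-attention over bits
            pooled = (h * bits.unsqueeze(-1)).sum(1) / bits.sum(1, keepdim=True).clamp(min=1)
            return self.classifier(pooled).squeeze(-1), attn_w    # logit + attention map

    # Toy usage with untrained weights: score terfenadine, a well-known hERG blocker.
    model = AttentionFingerprintNet()
    x = morgan_bits("CC(C)(C)c1ccc(cc1)C(O)CCCN2CCC(CC2)C(O)(c3ccccc3)c4ccccc4").unsqueeze(0)
    logit, attention = model(x)
    print(torch.sigmoid(logit))  # untrained probability; train with BCEWithLogitsLoss on labels

In practice one would train this on the curated hERG blocker/non-blocker labels mentioned in the abstract and then rank bits by their attention weights to surface candidate hERG-related substructures; the sketch only shows the data flow.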
Pages: 7
Related Papers
50 records
  • [41] A diagonal masking self-attention-based multi-scale network for motor imagery classification
    Yang, Kaijun
    Wang, Jihong
    Yang, Liantao
    Bian, Lifeng
    Luo, Zijiang
    Yang, Chen
    JOURNAL OF NEURAL ENGINEERING, 2024, 21 (03)
  • [42] Temporal self-attention-based Conv-LSTM network for multivariate time series prediction
    Fu, En
    Zhang, Yinong
    Yang, Fan
    Wang, Shuying
    NEUROCOMPUTING, 2022, 501 : 162 - 173
  • [43] Multi-Type Self-Attention-Based Convolutional-Neural-Network Post-Filtering for AV1 Codec
    Gwun, Woowoen
    Choi, Kiho
    Park, Gwang Hoon
    MATHEMATICS, 2024, 12 (18)
  • [44] Predicting Electricity Usage Based on Deep Neural Network
    Wei, Ran
    Gan, Qirui
    Wang, Huiquan
    Wang, Jinhai
    Dang, Xin
    2019 IEEE INTERNATIONAL CONFERENCE ON COMPUTATIONAL INTELLIGENCE AND VIRTUAL ENVIRONMENTS FOR MEASUREMENT SYSTEMS AND APPLICATIONS (CIVEMSA 2019), 2019, : 95 - 100
  • [45] Energy trading optimisation of microgrids considering degradation: a self-attention-based deep reinforcement learning method
    Xu, Bin
    Zhuang, Zhichao
    Bian, Honggen
    Wen, Qing
    Li, Chengyang
    Qi, Jin
    INTERNATIONAL JOURNAL OF CONTROL, 2025,
  • [46] A Deep Learning Approach for Classifying Vulnerability Descriptions Using Self Attention Based Neural Network
    P. R. Vishnu
    P. Vinod
    Suleiman Y. Yerima
    Journal of Network and Systems Management, 2022, 30
  • [47] A Deep Learning Approach for Classifying Vulnerability Descriptions Using Self Attention Based Neural Network
    Vishnu, P. R.
    Vinod, P.
    Yerima, Suleiman Y.
    JOURNAL OF NETWORK AND SYSTEMS MANAGEMENT, 2022, 30 (01)
  • [48] Multi-Head Self-Attention-Based Deep Clustering for Single-Channel Speech Separation
    Jin, Yanliang
    Tang, Chenjun
    Liu, Qianhong
    Wang, Yan
    IEEE ACCESS, 2020, 8 : 100013 - 100021
  • [49] DCSaNet: Dilated Convolution and Self-Attention-Based Neural Network for Channel Estimation in IRS-Aided Multi-User Communication System
    Li, Tingting
    Yang, Yang
    Lee, Jemin
    Qin, Xiaoqi
    Huang, Jingfei
    He, Gang
    IEEE WIRELESS COMMUNICATIONS LETTERS, 2023, 12 (07) : 1139 - 1143
  • [50] SACANet: end-to-end self-attention-based network for 3D clothing animation
    Chen, Yunxi
    Cao, Yuanjie
    Fang, Fei
    Huang, Jin
    Hu, Xinrong
    He, Ruhan
    Zhang, Junjie
    VISUAL COMPUTER, 2024, : 3829 - 3842