hERG-Att: Self-attention-based deep neural network for predicting hERG blockers

Cited by: 31
Authors
Kim, Hyunho [1 ]
Nam, Hojung [1 ]
Affiliations
[1] Gwangju Inst Sci & Technol GIST, Sch Elect Engn & Comp Sci, Gwangju 61005, South Korea
Funding
National Research Foundation of Singapore
Keywords
hERG blockers prediction; Deep learning; Self-attention mechanism; POTASSIUM CHANNELS; DRUGS; CLASSIFICATION; PROLONGATION; POINTES;
DOI
10.1016/j.compbiolchem.2020.107286
Chinese Library Classification
Q [Biological Sciences]
Subject classification codes
07; 0710; 09
Abstract
A voltage-gated potassium channel encoded by the human ether-à-go-go-related gene (hERG) regulates the cardiac action potential, and compounds that inhibit its activity can cause cardiotoxicity. Therefore, screening for hERG channel blockers is a mandatory step in the drug discovery process. Screening hERG blockers with conventional methods is inefficient in terms of cost and effort, which has led to the development of many in silico hERG blocker prediction models. However, constructing a high-performance predictive model that can also explain why a given compound blocks hERG remains a major obstacle. In this study, we developed the first attention-based, interpretable model that predicts hERG blockers and captures important hERG-related compound substructures. To do so, we first collected various datasets, ranging from public databases to publicly available private datasets, to train and test the model. We then developed a precise and interpretable hERG blocker prediction model by using deep learning with a self-attention approach on an appropriate molecular descriptor, the Morgan fingerprint. The proposed prediction model was validated, and the validation results showed that the model was well-optimized and achieved high performance. Its test-set performance was significantly higher than that of previous fingerprint-based conventional machine learning models; in particular, the proposed model achieved consistently high accuracy and F1 scores, demonstrating its predictive reliability. Furthermore, we interpreted the attention score vectors computed by the proposed model and demonstrated the important structural patterns represented in hERG blockers. In summary, we propose a powerful and interpretable hERG blocker prediction model that can reduce the overall cost of drug discovery by accurately screening for hERG blockers and suggesting hERG-related substructures.
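The abstract describes scoring molecular-fingerprint features with self-attention and reading the attention weights back for interpretability. As a hedged illustration only (not the authors' implementation; function names and the toy input are assumptions), the core scaled dot-product self-attention step, softmax(XX^T / sqrt(d)) X, can be sketched in pure Python:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of raw scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def self_attention(X):
    """Scaled dot-product self-attention over n feature vectors of length d.

    X plays the role of embedded fingerprint features (e.g. Morgan
    fingerprint bits mapped to vectors). Returns (outputs, weights); the
    weight matrix is the quantity an interpretable model would inspect to
    find substructures the prediction attends to.
    """
    n, d = len(X), len(X[0])
    scale = math.sqrt(d)
    # Raw attention scores: pairwise dot products, scaled by sqrt(d).
    scores = [[sum(a * b for a, b in zip(X[i], X[j])) / scale
               for j in range(n)] for i in range(n)]
    # Each row of weights sums to 1 after the softmax.
    weights = [softmax(row) for row in scores]
    # Each output vector is the attention-weighted sum of all inputs.
    out = [[sum(weights[i][j] * X[j][k] for j in range(n))
            for k in range(d)] for i in range(n)]
    return out, weights
```

In a model of this kind, rows of `weights` with large entries point at the input features (here, fingerprint positions) that most influenced the prediction, which is how attention scores can suggest hERG-related substructures.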
Pages: 7