hERG-Att: Self-attention-based deep neural network for predicting hERG blockers

Cited by: 31
Authors
Kim, Hyunho [1 ]
Nam, Hojung [1 ]
Affiliations
[1] Gwangju Inst Sci & Technol GIST, Sch Elect Engn & Comp Sci, Gwangju 61005, South Korea
Funding
National Research Foundation, Singapore;
Keywords
hERG blockers prediction; Deep learning; Self-attention mechanism; POTASSIUM CHANNELS; DRUGS; CLASSIFICATION; PROLONGATION; POINTES;
DOI
10.1016/j.compbiolchem.2020.107286
Chinese Library Classification
Q [Biological Sciences];
Discipline Codes
07; 0710; 09;
Abstract
A voltage-gated potassium channel encoded by the human ether-à-go-go-related gene (hERG) regulates the cardiac action potential, and compounds that inhibit its activity can cause cardiotoxicity. Therefore, screening for hERG channel blockers is a mandatory step in the drug discovery process. Screening for hERG blockers with conventional methods is inefficient in terms of cost and effort, which has led to the development of many in silico hERG blocker prediction models. However, constructing a high-performance predictive model that can also explain why certain compounds block hERG remains a major obstacle. In this study, we developed the first attention-based, interpretable model that predicts hERG blockers and captures important hERG-related compound substructures. To do so, we first collected various datasets, ranging from public databases to publicly available private datasets, to train and test the model. We then developed a precise and interpretable hERG blocker prediction model using deep learning with a self-attention approach on an appropriate molecular descriptor, the Morgan fingerprint. The proposed model was validated, and the validation results showed that it was well-optimized and performed well. Its test-set performance was significantly higher than that of previous fingerprint-based conventional machine learning models. In particular, the proposed model generally achieved high accuracy and F1 scores, demonstrating its predictive reliability. Furthermore, we interpreted the attention score vectors computed by the model and identified important structural patterns that are represented in hERG blockers. In summary, we propose a powerful and interpretable hERG blocker prediction model that can reduce the overall cost of drug discovery by accurately screening for hERG blockers and suggesting hERG-related substructures.
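The abstract's core mechanism, scoring Morgan-fingerprint substructures with self-attention so that high-weight substructures can be read off as hERG-relevant, can be sketched minimally. This is an illustrative sketch, not the paper's actual architecture: the dimensions, random weight initialization, and function name are assumptions, and a real implementation would compute fingerprints with a cheminformatics toolkit (e.g. RDKit's `GetMorganFingerprintAsBitVect`) and learn the projections by gradient descent.

```python
import numpy as np

rng = np.random.default_rng(0)

def self_attention(X):
    """Scaled dot-product self-attention over substructure embeddings.

    X has shape (n_substructures, d). Returns the attended output and
    the attention matrix, whose softmax-normalized rows are the scores
    a model of this kind would inspect for interpretation.
    """
    d = X.shape[1]
    # Hypothetical learned projections; random here for illustration.
    Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(d)
    # Numerically stable softmax over each row.
    e = np.exp(scores - scores.max(axis=1, keepdims=True))
    attn = e / e.sum(axis=1, keepdims=True)
    return attn @ V, attn

# Toy stand-in for the on-bits of a Morgan fingerprint: 5 active
# substructures, each embedded as an 8-dimensional vector.
X = rng.normal(size=(5, 8))
out, attn = self_attention(X)
print(out.shape)         # (5, 8)
print(attn.sum(axis=1))  # each row sums to 1 (softmax-normalized)
```

The interpretability claim in the abstract corresponds to reading off columns of `attn` that receive large attention mass: substructures attended to most strongly are the candidates flagged as hERG-related.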
Pages: 7
Related Papers (50 in total)
  • [31] EAML: ensemble self-attention-based mutual learning network for document image classification
    Bakkali, Souhail
    Ming, Zuheng
    Coustaty, Mickael
    Rusinol, Marcal
    INTERNATIONAL JOURNAL ON DOCUMENT ANALYSIS AND RECOGNITION, 2021, 24 (03) : 251 - 268
  • [32] Self-Attention-Based Deep Convolution LSTM Framework for Sensor-Based Badminton Activity Recognition
    Deng, Jingyang
    Zhang, Shuyi
    Ma, Jinwen
    SENSORS, 2023, 23 (20)
  • [34] Pretreatment-free SERS sensing of microplastics using a self-attention-based neural network on hierarchically porous Ag foams
    Guselnikova, Olga
    Trelin, Andrii
    Kang, Yunqing
    Postnikov, Pavel
    Kobashi, Makoto
    Suzuki, Asuka
    Shrestha, Lok Kumar
    Henzie, Joel
    Yamauchi, Yusuke
    NATURE COMMUNICATIONS, 2024, 15 (01)
  • [35] SAGL: A self-attention-based graph learning framework for predicting survival of colorectal cancer patients
    Yang, Ping
    Qiu, Hang
    Yang, Xulin
    Wang, Liya
    Wang, Xiaodong
    COMPUTER METHODS AND PROGRAMS IN BIOMEDICINE, 2024, 249
  • [36] Self-attention-based convolutional neural network and time-frequency common spatial pattern for enhanced motor imagery classification
    Zhang, Rui
    Liu, Guoyang
    Wen, Yiming
    Zhou, Weidong
    JOURNAL OF NEUROSCIENCE METHODS, 2023, 398
  • [37] A Self-Attention-Based Multi-Level Fusion Network for Aspect Category Sentiment Analysis
    Tian, Dong
    Shi, Jia
    Feng, Jianying
    COGNITIVE COMPUTATION, 2023, 15 (04) : 1372 - 1390
  • [38] Self-Attention-based Multi-Scale Feature Fusion Network for Road Ponding Segmentation
    Yang, Shangyu
    Zhang, Ronghui
    Sun, Wencai
    Chen, Shengru
    Ye, Cong
    Wu, Hao
    Li, Mengran
2024 2ND ASIA CONFERENCE ON COMPUTER VISION, IMAGE PROCESSING AND PATTERN RECOGNITION, CVIPPR 2024, 2024
  • [39] Self-Attention Based Neural Network for Predicting RNA-Protein Binding Sites
    Wang, Xinyi
    Zhang, Mingyang
    Long, Chunlin
    Yao, Lin
    Zhu, Min
    IEEE-ACM TRANSACTIONS ON COMPUTATIONAL BIOLOGY AND BIOINFORMATICS, 2023, 20 (02) : 1469 - 1479