PepNet: an interpretable neural network for anti-inflammatory and antimicrobial peptides prediction using a pre-trained protein language model

Times Cited: 3
Authors
Han, Jiyun [1 ]
Kong, Tongxin [1 ]
Liu, Juntao [1 ]
Affiliations
[1] Shandong Univ, Sch Math & Stat, Weihai 264209, Peoples R China
Funding
National Natural Science Foundation of China; National Key Research and Development Program of China;
Keywords
CYTOKINES IL-4; DATABASE;
DOI
10.1038/s42003-024-06911-1
Chinese Library Classification (CLC)
Q [Biological Sciences];
Discipline Classification Code
07 ; 0710 ; 09 ;
Abstract
Identifying anti-inflammatory peptides (AIPs) and antimicrobial peptides (AMPs) is crucial for discovering innovative and effective peptide-based therapies targeting inflammation and microbial infections. However, accurately identifying AIPs and AMPs remains a computational challenge, mainly because peptide sequence information is underutilized. Here, we propose PepNet, an interpretable neural network that predicts both AIPs and AMPs by applying a pre-trained protein language model to fully exploit peptide sequence information. It first captures information on residue arrangements and physicochemical properties using a residual dilated convolution block, and then captures diverse function-related information through a residual Transformer block that characterizes the residue representations generated by the pre-trained protein language model. After training and testing, PepNet demonstrates clear superiority over other leading AIP and AMP predictors and shows strong interpretability in its learned peptide representations. A user-friendly web server for PepNet is freely available at http://liulab.top/PepNet/server. PepNet thus achieves accurate identification of both AIPs and AMPs by combining an interpretable neural network with a pre-trained protein language model.
Pages: 13
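The abstract describes PepNet's architecture only at a high level. Below is a minimal PyTorch sketch of that design, assuming illustrative layer sizes, dilation rates, and head counts; the hyperparameters and the class names (PepNetSketch, ResidualDilatedConvBlock) are not taken from the paper. It shows how residue embeddings from a pre-trained protein language model could flow through a residual dilated convolution block and a residual Transformer block to a peptide-level prediction.

```python
# Illustrative sketch of a PepNet-style architecture, based only on the
# abstract's description. Layer sizes, dilation rates, and head counts are
# assumptions, not the published configuration.
import torch
import torch.nn as nn

class ResidualDilatedConvBlock(nn.Module):
    """Captures residue-arrangement/physicochemical patterns via dilated convolution."""
    def __init__(self, channels: int, dilation: int):
        super().__init__()
        # padding = dilation keeps the sequence length unchanged for kernel_size=3
        self.conv = nn.Conv1d(channels, channels, kernel_size=3,
                              padding=dilation, dilation=dilation)
        self.norm = nn.BatchNorm1d(channels)
        self.act = nn.ReLU()

    def forward(self, x):  # x: (batch, channels, length)
        return x + self.act(self.norm(self.conv(x)))  # residual connection

class PepNetSketch(nn.Module):
    def __init__(self, plm_dim: int = 1024, hidden: int = 256, n_heads: int = 8):
        super().__init__()
        self.proj = nn.Linear(plm_dim, hidden)  # project PLM residue embeddings
        self.conv_blocks = nn.Sequential(
            ResidualDilatedConvBlock(hidden, dilation=1),
            ResidualDilatedConvBlock(hidden, dilation=2),
        )
        self.transformer = nn.TransformerEncoderLayer(
            d_model=hidden, nhead=n_heads, batch_first=True)
        self.head = nn.Linear(hidden, 1)  # AIP or AMP probability

    def forward(self, plm_embeddings):  # (batch, length, plm_dim)
        h = self.proj(plm_embeddings)
        h = self.conv_blocks(h.transpose(1, 2)).transpose(1, 2)
        h = h + self.transformer(h)  # residual Transformer block
        return torch.sigmoid(self.head(h.mean(dim=1)))  # pool over residues

# Usage on stand-in embeddings for 4 peptides of length 30 (a real pipeline
# would feed embeddings from a pre-trained protein language model instead):
if __name__ == "__main__":
    model = PepNetSketch()
    scores = model(torch.randn(4, 30, 1024))
    print(scores.shape)  # torch.Size([4, 1])
```

The plm_dim of 1024 matches common protein language model embedding sizes (e.g., ProtT5), but the specific model and dimension used by PepNet are not stated in the abstract.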