Retention Time Prediction with Message-Passing Neural Networks

Cited by: 20

Authors
Osipenko, Sergey [1 ]
Nikolaev, Eugene [1 ]
Kostyukevich, Yury [1 ]
Affiliations
[1] Skolkovo Inst Sci & Technol, Ctr Computat & Data Intens Sci & Engn, Nobel Str 3, Moscow 121205, Russia
Keywords
retention time prediction; untargeted metabolomics; small molecules; message-passing neural networks; DIFFERENT GRADIENTS; FLOW-RATES; IDENTIFICATION; PERFORMANCE
DOI
10.3390/separations9100291
CLC number
O65 [Analytical Chemistry]
Discipline codes
070302; 081704
Abstract
Retention time prediction, facilitated by advances in machine learning, has become a useful tool in untargeted LC-MS applications. State-of-the-art approaches include graph neural networks and 1D-convolutional neural networks trained on the METLIN small molecule retention time dataset (SMRT). These approaches achieve prediction accuracy comparable with the experimental error for the training set. The weak point of retention time prediction approaches is the transfer of predictions to other chromatographic systems. The accuracy of this step depends both on the mapping method and on the accuracy of the general model trained on SMRT. Therefore, improvements to either part of the prediction workflow may lead to improved compound annotations. Here, we evaluate the ability of message-passing neural networks (MPNNs), which have demonstrated outstanding performance on many chemical tasks, to accurately predict retention times. The model was initially trained on SMRT, yielding mean and median absolute cross-validation errors of 32 and 16 s, respectively. The pretrained MPNN was then fine-tuned on five publicly available small reversed-phase retention sets in a transfer learning mode and demonstrated up to 30% improvement in prediction accuracy for these sets compared with state-of-the-art methods. We show that filtering isomeric candidates by predicted retention time, with thresholds obtained from ROC curves, eliminates up to 50% of false identities.
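The final step described in the abstract, filtering isomeric candidates by predicted retention time with a ROC-derived threshold, can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the function names, the use of scikit-learn's `roc_curve`, and Youden's J statistic for choosing the cutoff are all assumptions.

```python
# Hypothetical sketch: derive an |RT error| cutoff from a ROC curve over a
# labeled validation set, then filter isomeric candidates with it.
# Not the authors' implementation; cutoff selection via Youden's J is assumed.
import numpy as np
from sklearn.metrics import roc_curve


def best_threshold(abs_errors, labels):
    """Pick an absolute RT-error cutoff (in seconds) via ROC analysis.

    abs_errors: |predicted RT - observed RT| for each candidate
    labels:     1 for true identities, 0 for false isomeric candidates
    """
    # Smaller error should indicate a true identity, so score = -error.
    fpr, tpr, thresholds = roc_curve(labels, -np.asarray(abs_errors))
    youden_j = tpr - fpr                      # maximize TPR - FPR
    return float(-thresholds[np.argmax(youden_j)])


def filter_candidates(candidates, observed_rt, cutoff):
    """Keep candidates whose predicted RT falls within the cutoff."""
    return [name for name, predicted_rt in candidates
            if abs(predicted_rt - observed_rt) <= cutoff]
```

A usage example with synthetic, clearly separated errors: true identities with errors of 5-15 s and false candidates with errors of 60-120 s give a cutoff of 15 s, which then removes isomers whose predicted RT is far from the observed one.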
Pages: 9