Large-scale machine learning for metagenomics sequence classification

Cited by: 52
Authors
Vervier, Kevin [1,2,3,4]
Mahe, Pierre [1]
Tournoud, Maud [1]
Veyrieras, Jean-Baptiste [1]
Vert, Jean-Philippe [2,3,4]
Affiliations
[1] bioMerieux, Bioinformat Res Dept, F-69280 Marcy Letoile, France
[2] PSL Res Univ, CBIO Ctr Computat Biol, MINES ParisTech, F-77300 Fontainebleau, France
[3] Inst Curie, F-75248 Paris, France
[4] INSERM U900, F-75248 Paris, France
Funding
European Research Council;
Keywords
DOI
10.1093/bioinformatics/btv683
CLC Number
Q5 [Biochemistry];
Subject Classification Codes
071010; 081704;
Abstract
Motivation: Metagenomics characterizes the taxonomic diversity of microbial communities by sequencing DNA directly from an environmental sample. One of the main challenges in metagenomics data analysis is the binning step, where each sequenced read is assigned to a taxonomic clade. Because of the large volume of metagenomics datasets, binning methods need fast and accurate algorithms that can operate with reasonable computing requirements. While standard alignment-based methods provide state-of-the-art performance, compositional approaches that assign a taxonomic class to a DNA read based on the k-mers it contains have the potential to provide faster solutions.
Results: We propose a new rank-flexible machine learning-based compositional approach for taxonomic assignment of metagenomics reads and show that it benefits from increasing the number of fragments sampled from reference genomes to tune its parameters, up to a coverage of about 10, and from increasing the k-mer size to about 12. Tuning the method involves training machine learning models on about 10^8 samples in 10^7 dimensions, which is out of reach of standard software but can be done efficiently with modern implementations for large-scale machine learning. The resulting method is competitive in terms of accuracy with well-established alignment-based and composition-based tools for problems involving a small to moderate number of candidate species and reasonable amounts of sequencing errors. We show, however, that machine learning-based compositional approaches are still limited in their ability to deal with problems involving a greater number of species and are more sensitive to sequencing errors. We finally show that the new method outperforms the state of the art in its ability to classify reads from species whose lineages are absent from the reference database, and we confirm that compositional approaches achieve faster prediction times, with a 2- to 17-fold gain over the BWA-MEM short-read mapper depending on the number of candidate species and the level of sequencing noise.
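The compositional approach described above reduces to two steps that are easy to sketch: map each read or reference fragment to a vector of k-mer counts, then train a large-scale linear classifier on fragments drawn from reference genomes. The snippet below is a minimal illustrative sketch, not the authors' implementation: it assumes scikit-learn's FeatureHasher and SGDClassifier are acceptable stand-ins for the large-scale learner, uses k = 12 (the k-mer size the abstract reports as beneficial), and the helper names kmer_counts, train_on_fragments and predict_reads are hypothetical.

```python
# Illustrative sketch of compositional read classification with hashed k-mer
# features and an online linear model (assumption: scikit-learn is available;
# this is not the pipeline used in the paper).
from sklearn.feature_extraction import FeatureHasher
from sklearn.linear_model import SGDClassifier

K = 12  # k-mer size in the range the abstract identifies as beneficial

def kmer_counts(read, k=K):
    """Count the k-mers of a DNA read; the dict feeds the feature hasher."""
    counts = {}
    for i in range(len(read) - k + 1):
        kmer = read[i:i + k]
        counts[kmer] = counts.get(kmer, 0) + 1
    return counts

# Feature hashing keeps memory bounded even though 4^12 (about 1.7e7) distinct
# k-mers exist, which matches the ~10^7-dimensional setting in the abstract.
hasher = FeatureHasher(n_features=2**22, input_type="dict")
clf = SGDClassifier(alpha=1e-6)  # linear model trained by stochastic gradient descent

def train_on_fragments(fragments, labels, classes):
    """One online pass over fragments sampled from reference genomes."""
    X = hasher.transform(kmer_counts(f) for f in fragments)
    clf.partial_fit(X, labels, classes=classes)

def predict_reads(reads):
    """Assign each read to one of the candidate taxonomic classes."""
    X = hasher.transform(kmer_counts(r) for r in reads)
    return clf.predict(X)
```

In this sketch, repeated calls to train_on_fragments over freshly sampled fragments play the role of the increasing fragment coverage discussed in the abstract, and prediction reduces to one sparse matrix-vector product per read, which is consistent with the faster prediction times reported above.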
Pages: 1023 - 1032
Page count: 10
Related Papers
50 records in total
  • [31] A review of Nystrom methods for large-scale machine learning
    Sun, Shiliang
    Zhao, Jing
    Zhu, Jiang
    INFORMATION FUSION, 2015, 26 : 36 - 48
  • [32] Introduction to Special Issue on Large-Scale Machine Learning
    Hsu, Chun-Nan
    ACM TRANSACTIONS ON INTELLIGENT SYSTEMS AND TECHNOLOGY, 2011, 2 (03)
  • [33] Large-Scale Strategic Games and Adversarial Machine Learning
    Alpcan, Tansu
    Rubinstein, Benjamin I. P.
    Leckie, Christopher
    2016 IEEE 55TH CONFERENCE ON DECISION AND CONTROL (CDC), 2016, : 4420 - 4426
  • [34] Dynamic Control Flow in Large-Scale Machine Learning
    Yu, Yuan
    Abadi, Martin
    Barham, Paul
    Brevdo, Eugene
    Burrows, Mike
    Davis, Andy
    Dean, Jeff
    Ghemawat, Sanjay
    Harley, Tim
    Hawkins, Peter
    Isard, Michael
    Kudlur, Manjunath
    Monga, Rajat
    Murray, Derek
    Zheng, Xiaoqiang
    EUROSYS '18: PROCEEDINGS OF THE THIRTEENTH EUROSYS CONFERENCE, 2018,
  • [35] Large-Scale Machine Learning Approaches for Molecular Biophysics
    Ramanathan, Arvind
    Chennubhotla, Chakra S.
    Agarwal, Pratul K.
    Stanley, Christopher B.
    BIOPHYSICAL JOURNAL, 2015, 108 (02) : 370A - 370A
  • [36] Large-Scale Machine Learning at Verizon: Theory and Applications
    Srivastava, Ashok
    KDD'16: PROCEEDINGS OF THE 22ND ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, 2016, : 417 - 417
  • [37] Large-Scale Machine Learning with Stochastic Gradient Descent
    Bottou, Leon
    COMPSTAT'2010: 19TH INTERNATIONAL CONFERENCE ON COMPUTATIONAL STATISTICS, 2010, : 177 - 186
  • [38] Compressed linear algebra for large-scale machine learning
    Elgohary, Ahmed
    Boehm, Matthias
    Haas, Peter J.
    Reiss, Frederick R.
    Reinwald, Berthold
    VLDB JOURNAL, 2018, 27 (05) : 719 - 744
  • [39] Large-Scale Image Classification Using Active Learning
    Alajlan, Naif
    Pasolli, Edoardo
    Melgani, Farid
    Franzoso, Andrea
    IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, 2014, 11 (01) : 259 - 263
  • [40] Angel: a new large-scale machine learning system
    Jiang, Jie
    Yu, Lele
    Jiang, Jiawei
    Liu, Yuhong
    Cui, Bin
    NATIONAL SCIENCE REVIEW, 2018, 5 (02) : 216 - 236