Limiting relaxation times from Markov state models

Times cited: 7
Authors
Kells, Adam [1 ]
Annibale, Alessia [2 ]
Rosta, Edina [1 ]
Affiliations
[1] Kings Coll London, Dept Chem, London SE1 1DB, England
[2] Kings Coll London, Dept Math, London WC2R 2LS, England
Source
JOURNAL OF CHEMICAL PHYSICS | 2018, Vol. 149, No. 07
Funding
UK Engineering and Physical Sciences Research Council (EPSRC)
Keywords
COARSE MASTER-EQUATIONS; HISTOGRAM ANALYSIS; FREE-ENERGIES; DYNAMICS; KINETICS;
DOI
10.1063/1.5027203
Chinese Library Classification
O64 [Physical chemistry (theoretical chemistry); chemical physics]
Discipline codes
070304; 081704
Abstract
Markov state models (MSMs) are increasingly widely used in the analysis of molecular simulations to combine multiple trajectories and obtain more accurate time scale information about the slowest processes in the system. Typically, however, multiple lag times are used and analyzed as input parameters, yet convergence with respect to the choice of lag time is not always achievable. Here, we present a simple method for calculating the slowest relaxation time (RT) of the system in the limit of very long lag times. Our approach relies on the fact that the autocorrelation function of the propagator's second eigenvector is approximately single exponential at long lag times. This yields a simple equation for the behavior of the MSM's relaxation time as a function of the lag time with only two free parameters, one of which is the RT of the system. We demonstrate that the second parameter is a useful indicator of how Markovian a selected variable is for building the MSM. Fitting this function to data gives a limiting value for the optimal variational RT. Testing this on analytic and molecular dynamics data for Ala5 and umbrella sampling-biased ion channel simulations shows that the function accurately describes the behavior of the RT and, furthermore, that this RT can noticeably improve on the value calculated at the longest accessible lag time. We compare our RT limit to the hidden Markov model (HMM) approach, which typically finds RTs of comparable values. However, HMMs cannot be used in conjunction with biased simulation data, require more complex construction algorithms than MSMs, and yield RTs that are not variational, leading to ambiguity in the choice of lag time at which to build the HMM. Published by AIP Publishing.
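The fitting idea described in the abstract can be illustrated with a small sketch. If the second eigenvector's autocorrelation is single exponential at long lag times, lambda_2(tau) ≈ a·exp(-tau/t), then the implied relaxation time t~(tau) = -tau/ln lambda_2(tau) underestimates the true RT at finite lag, while a two-parameter fit (here done by linear regression on ln lambda_2, with illustrative values of a and t that are not from the paper) recovers the limiting RT:

```python
import math

# Synthetic second-eigenvalue decay: lambda2(tau) = a * exp(-tau / t_true).
# a_true and t_true are illustrative values, not taken from the paper.
a_true, t_true = 0.8, 100.0
taus = [5.0, 10.0, 20.0, 40.0]
lam2 = [a_true * math.exp(-tau / t_true) for tau in taus]

# Naive MSM implied relaxation time at each lag: t~(tau) = -tau / ln lambda2(tau)
implied = [-tau / math.log(l) for tau, l in zip(taus, lam2)]

# Two-parameter fit via linearization: ln lambda2 = ln a - tau / t
# (ordinary least squares on the log-transformed data)
n = len(taus)
y = [math.log(l) for l in lam2]
x_mean, y_mean = sum(taus) / n, sum(y) / n
slope = (sum((x - x_mean) * (yi - y_mean) for x, yi in zip(taus, y))
         / sum((x - x_mean) ** 2 for x in taus))
t_fit = -1.0 / slope                        # limiting relaxation time
a_fit = math.exp(y_mean - slope * x_mean)   # Markovianity indicator (a -> 1 for a Markovian variable)

print(f"implied RT at tau=10: {implied[1]:.1f}")  # well below t_true
print(f"fitted limiting RT:   {t_fit:.1f}")       # recovers t_true = 100
```

With a = 0.8, the finite-lag estimate at tau = 10 is only about 31 time units, while the fitted limit recovers the true RT of 100, which is the kind of correction the paper reports relative to the longest accessible lag time.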
Pages: 8