A low-rank complexity reduction algorithm for the high-dimensional kinetic chemical master equation

Cited: 1
Authors
Einkemmer, Lukas [1 ]
Mangott, Julian [1 ]
Prugger, Martina [1 ]
Affiliations
[1] Univ Innsbruck, Dept Math, Innsbruck, Tirol, Austria
Keywords
Complexity reduction; Dynamical low-rank approximation; Chemical master equation; High-dimensional problems; Reaction networks; PROJECTOR-SPLITTING INTEGRATOR; TIME INTEGRATION; SOLVER; SWITCH;
DOI
10.1016/j.jcp.2024.112827
Chinese Library Classification (CLC)
TP39 [Computer Applications];
Discipline codes
081203; 0835;
Abstract
It is increasingly recognized that stochastic effects must be taken into account in order to study biological cells. However, the corresponding mathematical formulation, the chemical master equation (CME), suffers from the curse of dimensionality, so solving it directly is not feasible for most realistic problems. In this paper we propose a dynamical low-rank algorithm for the CME that reduces the dimensionality of the problem by dividing the reaction network into partitions. Only reactions that cross partitions are subject to an approximation error (everything else is computed exactly). Compared to the commonly used stochastic simulation algorithm (SSA, a Monte Carlo method), this approach has the advantage that it is completely noise-free, which is particularly important if one is interested in resolving the tails of the probability distribution. We show that in some cases (e.g. for the lambda phage) the proposed method can drastically reduce memory consumption and run time and provide better accuracy than SSA.
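The partition-based low-rank idea described in the abstract can be illustrated with a small, hypothetical example. The Python sketch below is not taken from the paper; the array sizes, truncation rank, and toy distribution are assumptions made purely for illustration. It factors a joint probability table over two partitions with a truncated SVD and compares the storage cost of the factors with that of the full table; the paper itself evolves low-rank factors in time (cf. the "projector-splitting integrator" keyword) rather than factorizing a stored full distribution.

```python
# Illustrative sketch only: it demonstrates the low-rank *representation* of a
# probability distribution split across two partitions of a reaction network.
# It is NOT the authors' algorithm; n1, n2, the rank, and the toy distribution
# are arbitrary assumptions chosen for demonstration.
import numpy as np

n1, n2, rank = 64, 64, 5  # states per partition and truncation rank (assumed)

# Toy joint probability table P(x1, x2) over the two partitions.
x1 = np.arange(n1)[:, None]
x2 = np.arange(n2)[None, :]
P = np.exp(-0.1 * (x1 - 20) ** 2 - 0.1 * (x2 - 30) ** 2) * (1 + 0.3 * np.cos(0.2 * x1 * x2))
P /= P.sum()

# Truncated SVD gives low-rank factors X1 (n1 x r), S (r x r), X2 (n2 x r):
#   P(x1, x2) ≈ sum_{i,j} X1[x1, i] * S[i, j] * X2[x2, j]
U, s, Vt = np.linalg.svd(P, full_matrices=False)
X1, S, X2 = U[:, :rank], np.diag(s[:rank]), Vt[:rank, :].T

P_lr = X1 @ S @ X2.T
print("storage, full table :", P.size)
print("storage, low rank   :", X1.size + S.size + X2.size)
print("max abs error       :", np.abs(P - P_lr).max())
```

For two partitions the factor storage scales as O((n1 + n2) r + r^2) instead of O(n1 n2), which is the source of the memory reduction the abstract refers to; only the coupling between partitions (here, the cross term in the toy distribution) is affected by the rank truncation.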
Pages: 18
Related papers
50 records in total (items [31]-[40] shown below)
  • [31] Complexity reduction of iterative receivers using low-rank equalization
    Dietl, Guido
    Utschick, Wolfgang
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2007, 55 (03) : 1035 - 1046
  • [32] Adaptive low-rank approximation and denoised Monte Carlo approach for high-dimensional Lindblad equations
    Le Bris, C.
    Rouchon, P.
    Roussel, J.
    PHYSICAL REVIEW A, 2015, 92 (06):
  • [33] Integrating Low-rank Approximation and Word Embedding for Feature Transformation in the High-dimensional Text Classification
    Le Nguyen Hoai Nam
    Ho Bao Quoc
    KNOWLEDGE-BASED AND INTELLIGENT INFORMATION & ENGINEERING SYSTEMS, 2017, 112 : 437 - 446
  • [34] Low-rank separated representation surrogates of high-dimensional stochastic functions: Application in Bayesian inference
    Validi, AbdoulAhad
    JOURNAL OF COMPUTATIONAL PHYSICS, 2014, 260 : 37 - 53
  • [35] LR-GLM: High-Dimensional Bayesian Inference Using Low-Rank Data Approximations
    Trippe, Brian L.
    Huggins, Jonathan H.
    Agrawal, Raj
    Broderick, Tamara
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019, 97
  • [36] A low-rank control variate for multilevel Monte Carlo simulation of high-dimensional uncertain systems
    Fairbanks, Hillary R.
    Doostan, Alireza
    Ketelsen, Christian
    Iaccarino, Gianluca
    JOURNAL OF COMPUTATIONAL PHYSICS, 2017, 341 : 121 - 139
  • [37] LOW-RANK MATRIX FACTORIZATION FOR DEEP NEURAL NETWORK TRAINING WITH HIGH-DIMENSIONAL OUTPUT TARGETS
    Sainath, Tara N.
    Kingsbury, Brian
    Sindhwani, Vikas
    Arisoy, Ebru
    Ramabhadran, Bhuvana
    2013 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2013, : 6655 - 6659
  • [38] Low-rank diffusion matrix estimation for high-dimensional time-changed Levy processes
    Belomestny, Denis
    Trabs, Mathias
    ANNALES DE L INSTITUT HENRI POINCARE-PROBABILITES ET STATISTIQUES, 2018, 54 (03): : 1583 - 1621
  • [39] Rank reduction for high-dimensional generalized additive models
    Lin, Hongmei
    Lian, Heng
    Liang, Hua
    JOURNAL OF MULTIVARIATE ANALYSIS, 2019, 173 : 672 - 684
  • [40] SIMULTANEOUS LOW-RANK COMPONENT AND GRAPH ESTIMATION FOR HIGH-DIMENSIONAL GRAPH SIGNALS: APPLICATION TO BRAIN IMAGING
    Rui, Liu
    Nejati, Hossein
    Safavi, Seyed Hamid
    Cheung, Ngai-Man
    2017 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2017, : 4134 - 4138