A new look at state-space models for neural data

Cited: 133
Authors
Paninski, Liam [1 ,2 ]
Ahmadian, Yashar [1 ,2 ]
Ferreira, Daniel Gil [1 ,2 ]
Koyama, Shinsuke [3 ]
Rad, Kamiar Rahnama [1 ,2 ]
Vidne, Michael [1 ,2 ]
Vogelstein, Joshua [4 ]
Wu, Wei [5 ]
Affiliations
[1] Columbia Univ, Dept Stat, New York, NY 10027 USA
[2] Columbia Univ, Ctr Theoret Neurosci, New York, NY USA
[3] Carnegie Mellon Univ, Dept Stat, Pittsburgh, PA 15213 USA
[4] Johns Hopkins Univ, Dept Neurosci, Baltimore, MD USA
[5] Florida State Univ, Dept Stat, Tallahassee, FL 32306 USA
Keywords
Neural coding; State-space models; Hidden Markov model; Tridiagonal matrix; RECEPTIVE-FIELD PLASTICITY; HIDDEN MARKOV-MODELS; MAXIMUM-LIKELIHOOD; FIRING PATTERNS; DYNAMIC-ANALYSIS; POPULATION; INHIBITION; RECORDINGS; FRAMEWORK; NETWORK;
DOI
10.1007/s10827-009-0179-x
CLC classification
Q [Biological Sciences];
Discipline codes
07; 0710; 09;
Abstract
State-space methods have proven indispensable in neural data analysis. However, common methods for performing inference in state-space models with non-Gaussian observations rely on certain approximations which are not always accurate. Here we review direct optimization methods that avoid these approximations, but that nonetheless retain the computational efficiency of the approximate methods. We discuss a variety of examples, applying these direct optimization techniques to problems in spike train smoothing, stimulus decoding, parameter estimation, and inference of synaptic properties. Along the way, we point out connections to some related standard statistical methods, including spline smoothing and isotonic regression. Finally, we note that the computational methods reviewed here do not in fact depend on the state-space setting at all; instead, the key property we are exploiting involves the bandedness of certain matrices. We close by discussing some applications of this more general point of view, including Markov chain Monte Carlo methods for neural decoding and efficient estimation of spatially-varying firing rates.
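The abstract's central computational point is that the log-posterior Hessian in these models is banded (tridiagonal for a scalar latent state), so direct Newton optimization costs O(T) per iteration rather than O(T^3). The sketch below is an illustrative assumption, not the paper's actual code: a hypothetical 1D AR(1) latent state with Poisson spike-count observations, smoothed by Newton's method in which each step is a banded solve via `scipy.linalg.solve_banded`.

```python
import numpy as np
from scipy.linalg import solve_banded


def map_smooth_poisson(y, a=0.95, q=0.1, dt=1.0, n_iter=25):
    """MAP estimate of the latent path x for the illustrative model
        x_1 ~ N(0, q),  x_t | x_{t-1} ~ N(a * x_{t-1}, q),
        y_t | x_t ~ Poisson(exp(x_t) * dt),
    found by direct Newton optimization.  The Hessian of the
    log-posterior is tridiagonal, so each Newton step is an O(T)
    banded solve rather than an O(T^3) dense one."""
    y = np.asarray(y, dtype=float)
    T = len(y)
    # Tridiagonal prior precision J (inverse covariance of the AR(1) path).
    d_prior = np.full(T, (1.0 + a**2) / q)
    d_prior[-1] = 1.0 / q
    off = np.full(T - 1, -a / q)

    def jx(x):
        # J @ x computed in O(T), exploiting the tridiagonal structure.
        v = d_prior * x
        v[:-1] += off * x[1:]
        v[1:] += off * x[:-1]
        return v

    def logpost(x):
        # Log-posterior up to constants: Poisson log-likelihood + AR(1) prior.
        return np.sum(y * x - np.exp(x) * dt) - 0.5 * np.dot(x, jx(x))

    x = np.zeros(T)
    for _ in range(n_iter):
        lam = np.exp(x) * dt                 # conditional intensity
        grad = (y - lam) - jx(x)             # gradient of log-posterior
        # Negative Hessian diag(lam) + J in LAPACK banded storage.
        ab = np.zeros((3, T))
        ab[0, 1:] = off                      # superdiagonal
        ab[1, :] = lam + d_prior             # main diagonal
        ab[2, :-1] = off                     # subdiagonal
        step = solve_banded((1, 1), ab, grad)
        # Simple backtracking keeps the (concave) objective increasing.
        f0, s = logpost(x), 1.0
        while logpost(x + s * step) < f0 and s > 1e-8:
            s *= 0.5
        x = x + s * step
        if np.max(np.abs(s * step)) < 1e-10:
            break
    return x
```

Because the log-posterior is concave here (exponential link, Poisson observations), the Newton iteration converges quickly to the global MAP path; the same banded-solve structure extends to the decoding and parameter-estimation settings the abstract mentions.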
Pages: 107-126
Page count: 20
Related papers
50 records in total
  • [1] A new look at state-space models for neural data
    Liam Paninski
    Yashar Ahmadian
    Daniel Gil Ferreira
    Shinsuke Koyama
    Kamiar Rahnama Rad
    Michael Vidne
    Joshua Vogelstein
    Wei Wu
    Journal of Computational Neuroscience, 2010, 29 : 107 - 126
  • [2] Latent State-Space Models for Neural Decoding
    Aghagolzadeh, Mehdi
    Truccolo, Wilson
    2014 36TH ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY (EMBC), 2014, : 3033 - 3036
  • [3] Neural State-Space Models: Empirical Evaluation of Uncertainty Quantification
    Forgione, Marco
    Piga, Dario
    IFAC PAPERSONLINE, 2023, 56 (02): : 4082 - 4087
  • [4] Meta-Learning of Neural State-Space Models Using Data From Similar Systems
    Chakrabarty, Ankush
    Wichern, Gordon
    Laughman, Christopher R.
    IFAC PAPERSONLINE, 2023, 56 (02): : 1490 - 1495
  • [5] Discriminative State-Space Models
    Kuznetsov, Vitaly
    Mohri, Mehryar
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 30 (NIPS 2017), 2017, 30
  • [6] Dynamic state-space models
    Guo, WS
    JOURNAL OF TIME SERIES ANALYSIS, 2003, 24 (02) : 149 - 158
  • [7] Denoising neural data with state-space smoothing: Method and application
    Nalatore, Hariharan
    Ding, Mingzhou
    Rangarajan, Govindan
    JOURNAL OF NEUROSCIENCE METHODS, 2009, 179 (01) : 131 - 141
  • [8] Structured state-space models are deep Wiener models
    Bonassi, Fabio
    Andersson, Carl
    Mattsson, Per
    Schon, Thomas B.
    IFAC PAPERSONLINE, 2024, 58 (15): : 247 - 252
  • [9] State-space models of pipelines
    Geiger, Gerhard
    Marko, Drago
    PROCEEDINGS OF THE 17TH IASTED INTERNATIONAL CONFERENCE ON MODELLING AND SIMULATION, 2006, : 56 - +
  • [10] Ordinal Outcome State-Space Models for Intensive Longitudinal Data
    Henry, Teague R.
    Slipetz, Lindley R.
    Falk, Ami
    Qiu, Jiaxing
    Chen, Meng
    PSYCHOMETRIKA, 2024, 89 (04) : 1203 - 1229