            1.  Introduction
                Regime-switching state-space models (in short, switching SSMs) form a
            powerful class of time series models used in fields as varied as econometrics
[7], speech recognition [12], computer vision [1], and recently neuroimaging
            [11].  Switching SSMs can flexibly  track  nonstationary behavior and identify
            (possibly  low-dimensional)  latent  factors  in  time  series.  These  models  are
            particularly suitable in situations where dependencies between study variables
            are  modulated  by  an  underlying  regime  of  activity.  In  econometrics,  for
            example, such regimes could be “growth cycle” and “recession cycle”.
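    To fix ideas informally (the study models themselves are defined in Section 2, and the symbols below are chosen here for illustration only), a prototypical switching SSM pairs a hidden Markov regime sequence with regime-dependent linear-Gaussian state and observation equations:

    \[
    \begin{aligned}
    P(S_t = j \mid S_{t-1} = i) &= Z_{ij}, \qquad S_t \in \{1, \dots, M\}, \\
    x_t &= A_{S_t}\, x_{t-1} + v_t, \qquad v_t \sim \mathcal{N}(0, Q_{S_t}), \\
    y_t &= C_{S_t}\, x_t + w_t, \qquad w_t \sim \mathcal{N}(0, R_{S_t}),
    \end{aligned}
    \]

    where y_t is the observed series, x_t the latent state, S_t the regime at time t, and Z the regime transition matrix.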
                Several computational methods are available for switching SSMs. Bayesian
            approaches include Gibbs sampling [8], variational Bayes [4], and sequential
            Monte Carlo [3]. Frequentist approaches are typically based on the maximum
            likelihood estimator (MLE) and the Expectation-Maximization (EM) algorithm
[2, 5, 7, 13]. In practice, switching SSMs have mostly been applied to low-dimensional and relatively short time series. The case of high-dimensional
            and/or long time series, which is our focus here, poses considerable numerical
            challenges  both  for  model  fitting  and  statistical  inference.  Also,  to  our
            knowledge,  there  are  currently  no  publicly  available  software  packages for
            switching SSMs.
                This work aims to facilitate the implementation of switching SSMs with
            large  datasets. We  study  two  broadly  applicable  switching  SSMs  and  their
            implementation via the EM algorithm. Our contributions are as follows. First,
we provide two new initialization methods for the EM based on least-squares regression, K-means clustering, and dichotomic search. Indeed, the choice of
            starting  points  is  often  key  to  the  successful  convergence  of  optimization
            algorithms, especially for large datasets and models with many parameters.
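    As an illustration of what such an initialization can look like, the MATLAB sketch below assigns a preliminary regime label to each time point by clustering simple window-level features (per-channel log-variances) with K-means. The feature choice, window length, and function name are ours for illustration and are not taken from our toolbox.

    % Illustrative sketch (not toolbox code): preliminary regime labels via K-means.
    % y: N x T data matrix, M: number of regimes, winlen: window length.
    function S0 = init_regimes_kmeans(y, M, winlen)
        [N, T] = size(y);
        nwin = floor(T / winlen);
        feats = zeros(nwin, N);
        for w = 1:nwin
            idx = (w - 1) * winlen + (1:winlen);
            feats(w, :) = log(var(y(:, idx), 0, 2))';   % per-channel log-variance
        end
        labels = kmeans(feats, M, 'Replicates', 10);    % Statistics Toolbox
        S0 = repelem(labels(:)', winlen);               % window labels -> time points
        S0 = [S0, repmat(S0(end), 1, T - numel(S0))];   % pad the last partial window
    end

    Labels of this kind, combined with regime-wise least-squares estimates of the model matrices, can then serve as EM starting values.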
Second, we provide numerical optimization tools to handle constraints on the model parameters, such as equality constraints, fixed-coefficient constraints, or scaling constraints. Such constraints can prove important both for model interpretability and for the numerical stability and convergence of the EM. Third, we develop a parametric bootstrap method for the statistical inference of model parameters. In our experience, likelihood-based inference is not tractable in high-dimensional switching SSMs: the proposed bootstrap offers a viable alternative that can easily be computed in parallel (a schematic sketch is given after this paragraph). Fourth, we implement our
approach in a suite of MATLAB functions available at https://github.com/ddegras/switch-ssm. Applications of our switching SSMs to large electroencephalography (EEG) data from an epilepsy study and a brain-computer interface study will be presented orally (but not here for reasons of space).
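    The bootstrap referenced above parallelizes naturally over replicates. The MATLAB sketch below shows only its generic structure; simulate_switching_ssm and fit_switching_ssm_em are placeholder names for a model simulator and an EM fitting routine, not functions of our toolbox.

    % Schematic parametric bootstrap for a fitted switching SSM.
    % theta_hat: EM estimates from the observed data; T: length of the series.
    B = 500;                          % number of bootstrap replicates
    boot_estimates = cell(B, 1);
    parfor b = 1:B                    % replicates are independent: run in parallel
        yb = simulate_switching_ssm(theta_hat, T);     % simulate from the fitted model
        boot_estimates{b} = fit_switching_ssm_em(yb);  % re-estimate parameters by EM
    end
    % Standard errors and confidence intervals follow from the empirical
    % distribution of the replicated estimates in boot_estimates.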
                The paper is organized as follows. Section 2 gives a general account of
            switching SSMs and introduces our study models. Section 3 briefly describes
