Modeling and inference in multivariate time series are central problems in statistics, signal processing, and machine learning, with applications in social network analysis, biomedicine, and finance, to name a few. The linear-Gaussian state-space model is a common way to describe a time series through the evolution of a hidden state, with the advantage of admitting a simple exact inference procedure via the celebrated Kalman filter.
A fundamental question when analyzing multivariate sequences is the search for relationships between their entries (or the modeled hidden states), especially when the inherent structure is a directed (causal) graph. In this context, graphical modeling combined with parsimony constraints limits the proliferation of parameters and yields a compact data representation that is easier to interpret. We propose a novel expectation-maximization algorithm for estimating the linear matrix operator in the state equation of a linear-Gaussian state-space model.
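To fix ideas, the baseline pipeline behind such an approach (Kalman filter, Rauch-Tung-Striebel smoother, and an EM update of the transition matrix A) can be sketched as follows. This is a minimal illustration, not the talk's algorithm: it assumes the observation matrix H and the noise covariances Q, R are known, and it omits the graphical/sparsity constraints that are the talk's contribution; all names below are illustrative.

```python
import numpy as np

def kalman_filter(y, A, H, Q, R, m0, P0):
    """Forward Kalman filter for x_t = A x_{t-1} + q_t, y_t = H x_t + r_t."""
    n, d = y.shape[0], m0.shape[0]
    mf, Pf = np.zeros((n, d)), np.zeros((n, d, d))  # filtered moments
    mp, Pp = np.zeros((n, d)), np.zeros((n, d, d))  # predicted moments
    m, P = m0, P0
    for t in range(n):
        m, P = A @ m, A @ P @ A.T + Q               # predict
        mp[t], Pp[t] = m, P
        S = H @ P @ H.T + R                         # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)              # Kalman gain
        m = m + K @ (y[t] - H @ m)                  # measurement update
        P = P - K @ S @ K.T
        mf[t], Pf[t] = m, P
    return mf, Pf, mp, Pp

def rts_smoother(A, mf, Pf, mp, Pp):
    """Rauch-Tung-Striebel backward pass; also returns the smoother gains."""
    n, d = mf.shape
    ms, Ps = mf.copy(), Pf.copy()
    G = np.zeros((n - 1, d, d))
    for t in range(n - 2, -1, -1):
        G[t] = Pf[t] @ A.T @ np.linalg.inv(Pp[t + 1])
        ms[t] = mf[t] + G[t] @ (ms[t + 1] - mp[t + 1])
        Ps[t] = Pf[t] + G[t] @ (Ps[t + 1] - Pp[t + 1]) @ G[t].T
    return ms, Ps, G

def em_update_A(y, A, H, Q, R, m0, P0):
    """One (unconstrained) EM step for the transition matrix A."""
    mf, Pf, mp, Pp = kalman_filter(y, A, H, Q, R, m0, P0)
    ms, Ps, G = rts_smoother(A, mf, Pf, mp, Pp)
    d = m0.shape[0]
    Psi1 = np.zeros((d, d))  # sum_t E[x_t x_{t-1}^T | y_{1:n}]
    Psi0 = np.zeros((d, d))  # sum_t E[x_{t-1} x_{t-1}^T | y_{1:n}]
    for t in range(1, len(y)):
        # Lag-one smoothing cross-covariance: Cov(x_t, x_{t-1} | y) = Ps[t] G[t-1]^T
        Psi1 += np.outer(ms[t], ms[t - 1]) + Ps[t] @ G[t - 1].T
        Psi0 += np.outer(ms[t - 1], ms[t - 1]) + Ps[t - 1]
    return Psi1 @ np.linalg.inv(Psi0)

# Toy usage: simulate data from a known A, then iterate the EM update.
rng = np.random.default_rng(0)
d = 2
A_true = np.array([[0.9, 0.1], [0.0, 0.8]])
H, Q, R = np.eye(d), 0.01 * np.eye(d), 0.01 * np.eye(d)
m0, P0 = np.zeros(d), np.eye(d)
x, obs = np.zeros(d), []
for _ in range(300):
    x = A_true @ x + rng.multivariate_normal(np.zeros(d), Q)
    obs.append(H @ x + rng.multivariate_normal(np.zeros(d), R))
y = np.array(obs)
A_hat = 0.5 * np.eye(d)
for _ in range(10):
    A_hat = em_update_A(y, A_hat, H, Q, R, m0, P0)
```

The proposed method replaces the closed-form M-step above with one that enforces a sparse, graph-structured A; the E-step (filter plus smoother) is unchanged.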
This talk was part of the research day Interfaces between Statistics, Machine Learning and AI hosted by the Centre for Statistics and the Bayes Centre.