Modelling covariance matrices by the trigonometric separation strategy with application to hidden Markov models

Abstract
Bayesian inference on the covariance matrix is usually performed after placing an Inverse-Wishart or a multivariate Jeffreys prior density on it, but both of these, for different reasons, present some drawbacks. As an alternative, the covariance matrix can be modelled by separating out the standard deviations and the correlations. This separation strategy takes advantage of the fact that it is usually more immediate and flexible to set priors on the standard deviations and the correlations than on the covariance matrix itself. At the same time, the priors must preserve the positive definiteness of the correlation matrix. This can be achieved by considering the Cholesky decomposition of the correlation matrix, whose entries are reparameterized using trigonometric functions. The efficiency of the trigonometric separation strategy (TSS) is shown through an application to hidden Markov models (HMMs) with multivariate Normal conditional distributions, in the general case of an unknown number of hidden states. The number of states is estimated by a reversible jump Markov chain Monte Carlo algorithm based on split-and-combine and birth-and-death moves, whose design is straightforward thanks to the use of the TSS. Finally, an example in remote sensing is described, in which an HMM containing the TSS is used for the segmentation of a multi-colour satellite image.
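As a rough illustration of the separation idea summarised in the abstract, the sketch below builds a covariance matrix from a vector of standard deviations and an angle-parameterized Cholesky factor of the correlation matrix (a spherical/trigonometric parameterization with angles in (0, pi), which guarantees positive definiteness). The function names and the indexing of the angles are assumptions made for illustration only, not the code used in the paper.

```python
import numpy as np

def chol_from_angles(theta):
    """Build a lower-triangular Cholesky factor L of a correlation matrix
    from angles theta[i, j] in (0, pi) (strictly lower triangle of theta),
    so that R = L @ L.T has unit diagonal and is positive definite."""
    d = theta.shape[0]
    L = np.zeros((d, d))
    for i in range(d):
        for j in range(i + 1):
            # product of sines of the angles preceding position j in row i
            prod_sin = np.prod(np.sin(theta[i, :j]))
            if j < i:
                L[i, j] = np.cos(theta[i, j]) * prod_sin
            else:
                # diagonal entry: product of all sines in the row
                L[i, j] = prod_sin
    return L

def covariance_from_tss(sigma, theta):
    """Recombine standard deviations and angle-parameterized correlations
    into a covariance matrix: Sigma = D R D with D = diag(sigma)."""
    L = chol_from_angles(theta)
    R = L @ L.T
    D = np.diag(sigma)
    return D @ R @ D

# Example: 3-dimensional case with arbitrary angles in (0, pi)
theta = np.zeros((3, 3))
theta[1, 0], theta[2, 0], theta[2, 1] = 1.0, 0.8, 2.0
sigma = np.array([1.5, 0.7, 2.0])
Sigma = covariance_from_tss(sigma, theta)
print(np.linalg.eigvalsh(Sigma))  # all eigenvalues positive
```

Because the angles are unconstrained apart from lying in an interval and the standard deviations are simply positive, priors can be placed on them separately, which is the convenience the separation strategy exploits.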
Year
2019
Category
Refereed journal
Output Tags
WP 1.2 Water resources and flood risk management (RESAS 2016-21)