Abstract
This paper addresses the problem of designing efficient sampling moves to accelerate the convergence of MCMC methods. The partially collapsed Gibbs sampler (PCGS) takes advantage of variable reordering, marginalization and trimming to converge faster than the traditional Gibbs sampler. This work studies two specific moves that further improve the convergence of the PCGS. It considers a Bayesian model in which structured sparsity is enforced using a multivariate Bernoulli Laplacian prior. The posterior distribution associated with this model depends on mixed discrete and continuous random vectors. Because of the discrete part of the posterior, the conventional PCGS easily gets stuck around local maxima. Two Metropolis-Hastings moves, based on multiple dipole random shifts and inter-chain proposals, are proposed to overcome this problem. The resulting PCGS is applied to EEG source localization. Experiments conducted with synthetic data illustrate the effectiveness of the proposed PCGS and its accelerated convergence.
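As a rough illustration of the first kind of move described in the abstract, the sketch below implements a generic "multiple dipole random shift" Metropolis-Hastings proposal on a binary dipole-activation vector. The surrogate log-posterior, the function names, the neighborhood structure, and all dimensions are assumptions made for this sketch, not the paper's actual Bernoulli Laplacian model; the inter-chain proposal is not sketched.

```python
import numpy as np

def log_posterior(z, leadfield, y, lam=1.0):
    # Stand-in for the (collapsed) log-posterior of the binary dipole
    # activation vector z: least-squares fit of the active columns of the
    # leadfield matrix, penalized by the number of active dipoles. The
    # actual model in the paper uses a Bernoulli Laplacian hierarchy.
    active = np.flatnonzero(z)
    if active.size == 0:
        return -0.5 * float(y @ y)
    G = leadfield[:, active]
    x, *_ = np.linalg.lstsq(G, y, rcond=None)
    resid = y - G @ x
    return -0.5 * float(resid @ resid) - lam * active.size

def multiple_dipole_shift_move(z, neighbors, log_post, rng, n_shift=2):
    # One Metropolis-Hastings move that jointly shifts several active
    # dipoles to randomly chosen neighboring positions. The proposal is
    # treated as symmetric here (reasonable for a regular neighborhood
    # graph); a full implementation would add the Hastings correction.
    active = np.flatnonzero(z)
    if active.size == 0:
        return z
    z_prop = z.copy()
    chosen = rng.choice(active, size=min(n_shift, active.size), replace=False)
    for i in chosen:
        j = rng.choice(neighbors[i])        # candidate neighboring dipole
        z_prop[i], z_prop[j] = 0, 1         # move activation from i to j
    if np.log(rng.uniform()) < log_post(z_prop) - log_post(z):
        return z_prop                       # accept the shifted configuration
    return z                                # reject, keep the current state

# Minimal usage on synthetic data (sizes chosen arbitrarily).
rng = np.random.default_rng(0)
n_sensors, n_dipoles = 8, 20
leadfield = rng.standard_normal((n_sensors, n_dipoles))
y = leadfield[:, [3, 11]] @ np.array([1.0, -0.8]) + 0.1 * rng.standard_normal(n_sensors)
neighbors = {i: [(i - 1) % n_dipoles, (i + 1) % n_dipoles] for i in range(n_dipoles)}
z = np.zeros(n_dipoles, dtype=int)
z[[2, 10]] = 1                              # arbitrary initial support
for _ in range(500):
    z = multiple_dipole_shift_move(z, neighbors,
                                   lambda s: log_posterior(s, leadfield, y), rng)
print("active dipoles:", np.flatnonzero(z))
```

Shifting several dipoles at once lets the chain escape local modes of the discrete support that single-site updates cannot leave easily, which is the stated motivation for this move in the abstract.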
Original language | English |
---|---|
Title of host publication | 2016 IEEE Statistical Signal Processing Workshop (SSP) |
Publisher | IEEE |
ISBN (Electronic) | 9781467378031 |
Publication status | Published - 25 Aug 2016 |
Event | 19th IEEE Statistical Signal Processing Workshop 2016, Palma de Mallorca, Spain, 25 Jun 2016 → 29 Jun 2016 |
Conference
Conference | 19th IEEE Statistical Signal Processing Workshop 2016 |
---|---|
Abbreviated title | SSP 2016 |
Country/Territory | Spain |
City | Palma de Mallorca |
Period | 25/06/16 → 29/06/16 |
Keywords
- hierarchical Bayesian model
- MCMC
- Metropolis-Hastings moves
- partially collapsed Gibbs sampler
ASJC Scopus subject areas
- Electrical and Electronic Engineering
- Applied Mathematics
- Signal Processing
- Computer Science Applications