A Distributed Gibbs Sampler with Hypergraph Structure for High-Dimensional Inverse Problems

Pierre-Antoine Thouvenin, Audrey Repetti, Pierre Chainais

Research output: Contribution to journal › Article › peer-review



Sampling-based algorithms are classical approaches to performing Bayesian inference in inverse problems. They provide estimators along with credibility intervals that quantify the uncertainty of these estimators. Although such methods hardly scale to high-dimensional problems, they have recently been paired with optimization techniques, such as proximal and splitting approaches, to address this limitation. These approaches pave the way to distributed samplers that split computations to make inference more scalable and faster. We introduce a distributed Gibbs sampler to efficiently solve such problems, considering posterior distributions with multiple smooth and non-smooth functions composed with linear operators. The proposed approach leverages a recent approximate augmentation technique reminiscent of primal-dual optimization methods. It is further combined with a block-coordinate approach that splits the primal and dual variables into blocks, yielding a distributed block-coordinate Gibbs sampler. The resulting algorithm exploits the hypergraph structure of the involved linear operators to distribute the variables efficiently over multiple workers under controlled communication costs. It accommodates several distributed architectures, such as Single Program Multiple Data and client-server architectures. Experiments on a large image deblurring problem demonstrate that the proposed approach produces high-quality estimates with credibility intervals in a small amount of time.
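The core mechanism underlying the method, Gibbs sampling, alternately draws each block of variables from its conditional distribution given the others. A minimal illustrative sketch on a toy bivariate Gaussian target follows; this is a generic two-block Gibbs sampler for exposition only, not the authors' distributed, hypergraph-structured algorithm:

```python
import numpy as np

def gibbs_bivariate_gaussian(rho, n_samples=20000, burn_in=2000, seed=0):
    """Two-block Gibbs sampler for a zero-mean bivariate Gaussian
    with unit variances and correlation rho.

    The full conditionals are Gaussian:
        x | y ~ N(rho * y, 1 - rho^2)
        y | x ~ N(rho * x, 1 - rho^2)
    """
    rng = np.random.default_rng(seed)
    cond_std = np.sqrt(1.0 - rho**2)  # conditional standard deviation
    x, y = 0.0, 0.0                   # arbitrary initialization
    samples = np.empty((n_samples, 2))
    for i in range(burn_in + n_samples):
        # Alternately sample each coordinate from its full conditional.
        x = rho * y + cond_std * rng.standard_normal()
        y = rho * x + cond_std * rng.standard_normal()
        if i >= burn_in:
            samples[i - burn_in] = (x, y)
    return samples

samples = gibbs_bivariate_gaussian(rho=0.8)
# The empirical correlation of the chain should be close to 0.8.
print(np.corrcoef(samples.T)[0, 1])
```

In the paper's high-dimensional setting, the same conditional-sampling principle is applied per block of primal and dual variables, which is what makes the distribution of blocks over workers possible.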
Original language: English
Number of pages: 38
Journal: Journal of Computational and Graphical Statistics
Publication status: Submitted - Oct 2022


