TY - JOUR
T1 - A Proximal Markov Chain Monte Carlo Method for Bayesian Inference in Imaging Inverse Problems
T2 - When Langevin Meets Moreau
AU - Durmus, Alain
AU - Moulines, Éric
AU - Pereyra, Marcelo
N1 - Funding Information:
* Published electronically November 3, 2022. This paper originally appeared in SIAM Journal on Imaging Sciences, Volume 11, Number 1, 2018, pages 473-506, under the title "Efficient Bayesian Computation by Proximal Markov Chain Monte Carlo: When Langevin Meets Moreau." https://doi.org/10.1137/22M1522917 Funding: Part of this work was conducted when the third author held a Marie Curie Intra-European Research Fellowship for Career Development at the School of Mathematics of the University of Bristol and was a visiting scholar at the School of Mathematical and Computer Sciences of Heriot-Watt University. † LTCI, Telecom ParisTech, Paris 75013, France ([email protected]). ‡ Centre de Mathématiques Appliquées, UMR 7641, École Polytechnique, 91128 Palaiseau cedex, France ([email protected]). § Maxwell Institute for Mathematical Sciences and School of Mathematical and Computer Sciences, Heriot-Watt University, Edinburgh EH14 4AS, UK ([email protected]).
Publisher Copyright:
© 2022 Society for Industrial and Applied Mathematics Publications. All rights reserved.
PY - 2022
Y1 - 2022
N2 - Modern imaging methods rely strongly on Bayesian inference techniques to solve challenging imaging problems. Currently, the predominant Bayesian computational approach is convex optimization, which scales very efficiently to high-dimensional image models and delivers accurate point estimation results. However, in order to perform more complex analyses, for example, image uncertainty quantification or model selection, it is often necessary to use more computationally intensive Bayesian computation techniques such as Markov chain Monte Carlo methods. This paper presents a new and highly efficient Markov chain Monte Carlo methodology to perform Bayesian computation for high-dimensional models that are log-concave and nonsmooth, a class of models that is central in imaging sciences. The methodology is based on a regularized unadjusted Langevin algorithm that exploits tools from convex analysis, namely, Moreau-Yosida envelopes and proximal operators, to construct Markov chains with favorable convergence properties. In addition to scaling efficiently to high dimensions, the method can be applied in a straightforward manner to models that are currently solved using proximal optimization algorithms. We provide a detailed theoretical analysis of the proposed methodology, including asymptotic and nonasymptotic convergence results with easily verifiable conditions, and explicit bounds on the convergence rates. The proposed methodology is demonstrated with five experiments related to image deconvolution and tomographic reconstruction with total-variation and ℓ1 priors, where we conduct a range of challenging Bayesian analyses related to uncertainty quantification, hypothesis testing, and model selection in the absence of ground truth.
AB - Modern imaging methods rely strongly on Bayesian inference techniques to solve challenging imaging problems. Currently, the predominant Bayesian computational approach is convex optimization, which scales very efficiently to high-dimensional image models and delivers accurate point estimation results. However, in order to perform more complex analyses, for example, image uncertainty quantification or model selection, it is often necessary to use more computationally intensive Bayesian computation techniques such as Markov chain Monte Carlo methods. This paper presents a new and highly efficient Markov chain Monte Carlo methodology to perform Bayesian computation for high-dimensional models that are log-concave and nonsmooth, a class of models that is central in imaging sciences. The methodology is based on a regularized unadjusted Langevin algorithm that exploits tools from convex analysis, namely, Moreau-Yosida envelopes and proximal operators, to construct Markov chains with favorable convergence properties. In addition to scaling efficiently to high dimensions, the method can be applied in a straightforward manner to models that are currently solved using proximal optimization algorithms. We provide a detailed theoretical analysis of the proposed methodology, including asymptotic and nonasymptotic convergence results with easily verifiable conditions, and explicit bounds on the convergence rates. The proposed methodology is demonstrated with five experiments related to image deconvolution and tomographic reconstruction with total-variation and ℓ1 priors, where we conduct a range of challenging Bayesian analyses related to uncertainty quantification, hypothesis testing, and model selection in the absence of ground truth.
KW - Bayesian inference
KW - convex optimization
KW - inverse problems
KW - Markov chain Monte Carlo methods
KW - mathematical imaging
KW - model selection
KW - uncertainty quantification
UR - http://www.scopus.com/inward/record.url?scp=85145167952&partnerID=8YFLogxK
U2 - 10.1137/22M1522917
DO - 10.1137/22M1522917
M3 - Article
AN - SCOPUS:85145167952
SN - 0036-1445
VL - 64
SP - 991
EP - 1028
JO - SIAM Review
JF - SIAM Review
IS - 4
ER -