Abstract
Image restoration has long been one of the key research topics in image processing. Many mathematical approaches have been developed to solve this problem, e.g., variational methods, wavelet techniques, or Bayesian methods. With the widespread adoption of neural network (NN) models across all subdomains of data science, the performance limits of these methods have been pushed further. One of the most successful strategies consists of plugging NNs into existing optimization algorithms.
However, doing so raises several mathematical and practical challenges. One of the main issues is securing the convergence of the resulting iterative scheme. Further questions concerning the characterization of the reached limit are also worth addressing. In this paper, we show that the theory of maximally monotone operators allows us to provide insightful answers to these problems, and to design NNs leading to excellent results in terms of global restoration quality.
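The strategy of plugging an NN into an existing optimization algorithm can be illustrated with a minimal plug-and-play forward-backward sketch. This is not the paper's method; the denoiser here is a hypothetical stand-in (soft-thresholding plays the role of the learned operator), and the forward operator `A`, step size, and iteration count are illustrative assumptions.

```python
import numpy as np

def pnp_forward_backward(y, A, denoiser, step, n_iter=100):
    """Plug-and-play forward-backward splitting (illustrative sketch).

    A gradient step on the data-fidelity term 0.5 * ||A x - y||^2 is
    followed by a denoising step, where the proximal operator of the
    (implicit) regularizer is replaced by a denoiser. Convergence of
    such schemes is the kind of question addressed via the theory of
    maximally monotone operators.
    """
    x = A.T @ y  # simple initialization from the observation
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)       # gradient of the data-fidelity term
        x = denoiser(x - step * grad)  # denoiser replaces the proximal step
    return x

# Toy example: identity forward operator, sparse signal, Gaussian noise.
rng = np.random.default_rng(0)
x_true = np.zeros(50)
x_true[[5, 20, 35]] = [2.0, -1.5, 3.0]
A = np.eye(50)
y = x_true + 0.1 * rng.standard_normal(50)

# Soft-thresholding stands in for a learned NN denoiser.
soft = lambda v: np.sign(v) * np.maximum(np.abs(v) - 0.1, 0.0)

x_hat = pnp_forward_backward(y, A, soft, step=1.0, n_iter=50)
```

In this toy setting the scheme reduces the reconstruction error relative to the raw observation; in the plug-and-play literature the soft-thresholding step would be replaced by a trained network, and establishing convergence then requires structural conditions on that network.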
| Original language | English |
|---|---|
| Title of host publication | IEEE ICIP |
| Publisher | IEEE |
| Number of pages | 5 |
| Publication status | Submitted - 21 Jan 2021 |