Abstract
Unfolded proximal neural networks (PNNs) form a family of methods that combines deep learning and proximal optimization approaches. A PNN is designed for a specific task by unrolling a proximal algorithm for a fixed number of iterations, where its linear operators can be learned through a prior training procedure. PNNs have been shown to be more robust than traditional deep learning approaches while achieving at least comparable performance, in particular in computational imaging. However, training PNNs still depends on the efficiency of available training algorithms. In this work, we propose a lifted training formulation based on Bregman distances for unfolded PNNs. Leveraging the deterministic mini-batch block-coordinate forward-backward method, we design a bespoke computational strategy that goes beyond traditional back-propagation methods to solve the resulting learning problem efficiently. We assess the behaviour of the proposed training approach through numerical simulations on image denoising, considering a denoising PNN whose structure is based on dual proximal-gradient iterations.
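To make the architecture referred to in the last sentence more concrete, the following PyTorch sketch illustrates what unrolling dual proximal-gradient iterations into a fixed-depth denoising network can look like. It is purely illustrative and is not the authors' implementation; the convolutional parameterization of the linear operators, the learned step sizes and regularization weights, the layer count, and all names (`DualFBLayer`, `DenoisingPNN`) are assumptions for the example, and the lifted Bregman training strategy proposed in the paper is not shown here.

```python
# Illustrative sketch (not the paper's code): an unfolded denoising PNN whose
# layers mimic dual proximal-gradient (dual forward-backward) iterations for
#     min_x 0.5 * ||x - y||^2 + lam * ||W_k x||_1,
# where the linear operators W_k (here convolutions) are learned per layer.
import torch
import torch.nn as nn


class DualFBLayer(nn.Module):
    """One unrolled dual proximal-gradient iteration with a learned analysis operator."""

    def __init__(self, channels=8, kernel_size=3):
        super().__init__()
        # Learned linear operator W_k and a learned transpose-like counterpart.
        self.W = nn.Conv2d(1, channels, kernel_size, padding=kernel_size // 2, bias=False)
        self.Wt = nn.ConvTranspose2d(channels, 1, kernel_size, padding=kernel_size // 2, bias=False)
        self.gamma = nn.Parameter(torch.tensor(1.0))  # step size (learned, unconstrained in this sketch)
        self.lam = nn.Parameter(torch.tensor(0.1))    # regularization weight (learned)

    def forward(self, u, y):
        # Primal estimate from the current dual variable: x = y - W^T u.
        x = y - self.Wt(u)
        # Dual gradient step, then prox of the conjugate of lam*||.||_1,
        # i.e. projection onto the l-infinity ball of radius lam (a clamp).
        u = u + self.gamma * self.W(x)
        return torch.clamp(u, -self.lam.abs(), self.lam.abs())


class DenoisingPNN(nn.Module):
    """Unfolded PNN: a fixed number of dual proximal-gradient layers."""

    def __init__(self, num_layers=10, channels=8):
        super().__init__()
        self.layers = nn.ModuleList(DualFBLayer(channels) for _ in range(num_layers))

    def forward(self, y):
        u = torch.zeros_like(self.layers[0].W(y))  # dual variable initialised at zero
        for layer in self.layers:
            u = layer(u, y)
        return y - self.layers[-1].Wt(u)  # final primal (denoised) estimate


if __name__ == "__main__":
    y = torch.randn(1, 1, 32, 32)       # toy noisy image
    model = DenoisingPNN(num_layers=10)
    x_hat = model(y)                    # denoised output, same shape as y
    print(x_hat.shape)
```

Each layer owns its own operator `W_k`, matching the idea of linear operators learned across unrolled iterations; tying the weights across layers would instead recover a plain fixed-point scheme with a single learned operator.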
Original language | English |
---|---|
Title of host publication | 2024 IEEE 34th International Workshop on Machine Learning for Signal Processing (MLSP) |
Publisher | IEEE |
Number of pages | 6 |
ISBN (Electronic) | 979-8-3503-7225-0 |
DOIs | |
Publication status | Published - 4 Nov 2024 |
Event | 34th IEEE International Workshop on Machine Learning for Signal Processing 2024 - London, United Kingdom<br>Duration: 22 Sept 2024 → 25 Sept 2024<br>https://2024.ieeemlsp.org/ |
Conference
Conference | 34th IEEE International Workshop on Machine Learning for Signal Processing 2024 |
---|---|
Abbreviated title | MLSP 2024 |
Country/Territory | United Kingdom |
City | London |
Period | 22/09/24 → 25/09/24 |
Internet address | https://2024.ieeemlsp.org/ |