Abstract
From early image processing to modern computational imaging, successful models and algorithms have relied on a fundamental property of natural signals: symmetry. Here, symmetry refers to the invariance of signal sets under transformations such as translation, rotation, or scaling. Symmetry can also be incorporated into deep neural networks (DNNs) in the form of equivariance, allowing for more data-efficient learning. While there have been important advances in the design of end-to-end equivariant networks for image classification in recent years, computational imaging introduces unique challenges for equivariant network solutions, since we typically observe the image only through a noisy, ill-conditioned forward operator that may itself not be equivariant. We review the emerging field of equivariant imaging (EI) and show how it can provide improved generalization and new imaging opportunities. Along the way, we show the interplay between the acquisition physics and group actions, and links to iterative reconstruction, blind compressed sensing, and self-supervised learning.
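To make the equivariance idea in the abstract more concrete, the snippet below is a minimal PyTorch sketch of an equivariant-imaging style self-supervised loss: a measurement-consistency term plus an equivariance term that compares reconstructions of group-transformed estimates. The specifics are assumptions for illustration only, not taken from the article: the forward operator is a fixed inpainting mask, the group action is a 90-degree rotation, and `TinyReconNet` is a placeholder reconstruction network.

```python
# Minimal sketch of an equivariant-imaging style self-supervised loss.
# Assumptions (not from the article's text): A is a fixed binary inpainting
# mask, T_g is a 90-degree rotation, and f is a small placeholder CNN.
import torch
import torch.nn as nn


class TinyReconNet(nn.Module):
    """Placeholder reconstruction network f: y -> x_hat."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 3, padding=1),
        )

    def forward(self, y):
        return self.net(y)


def forward_operator(x, mask):
    """Ill-conditioned A: keep only a subset of pixels (inpainting)."""
    return x * mask


def rotate90(x, k=1):
    """Group action T_g: rotate image tensors by k * 90 degrees."""
    return torch.rot90(x, k, dims=(-2, -1))


def ei_loss(f, y, mask):
    """Measurement consistency + equivariance consistency."""
    x_hat = f(y)                              # reconstruct from measurements only
    loss_mc = ((forward_operator(x_hat, mask) - y) ** 2).mean()

    k = int(torch.randint(1, 4, (1,)))        # random element of the rotation group
    x_rot = rotate90(x_hat, k)                # transformed estimate T_g x_hat
    y_rot = forward_operator(x_rot, mask)     # re-measure the transformed image
    x_rot_hat = f(y_rot)                      # reconstruct the new measurements
    loss_eq = ((x_rot_hat - x_rot) ** 2).mean()

    return loss_mc + loss_eq


if __name__ == "__main__":
    torch.manual_seed(0)
    f = TinyReconNet()
    opt = torch.optim.Adam(f.parameters(), lr=1e-3)
    x = torch.rand(4, 1, 32, 32)              # stand-in images (only used to form y)
    mask = (torch.rand(1, 1, 32, 32) > 0.5).float()
    y = forward_operator(x, mask)             # training uses measurements y alone
    for step in range(5):
        opt.zero_grad()
        loss = ei_loss(f, y, mask)
        loss.backward()
        opt.step()
        print(f"step {step}: loss = {loss.item():.4f}")
```

The equivariance term exploits the fact that, while the mask destroys information in fixed pixel locations, rotating the estimate and re-measuring it effectively probes different "virtual" operators, which is the intuition the abstract points to; the exact losses, operators, and groups used in practice are detailed in the article itself.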
| Field | Value |
|---|---|
| Original language | English |
| Pages (from-to) | 134-147 |
| Number of pages | 14 |
| Journal | IEEE Signal Processing Magazine |
| Volume | 40 |
| Issue number | 1 |
| Early online date | 29 Dec 2022 |
| DOIs | |
| Publication status | Published - Jan 2023 |
Keywords
- Inverse problems
- Equivariant imaging
- Equivariant neural networks
- Deep learning
- Computational imaging
- Unrolled networks
- Unsupervised learning
ASJC Scopus subject areas
- Signal Processing
- Applied Mathematics
- Electrical and Electronic Engineering