A Primal-Dual Data-Driven Method for Computational Optical Imaging with a Photonic Lantern

Carlos Santos Garcia, Mathilde Larcheveque, Solal O'Sullivan, Martin Van Waerebeke, Robert R. Thomson, Audrey Repetti, Jean-Christophe Pesquet

Research output: Contribution to journal › Article › peer-review

Abstract

Optical fibers make it possible to image biological processes in vivo. In this context, high spatial resolution and robustness to fiber movement are key to enabling decision-making (e.g., in microendoscopy). Recently, a single-pixel imaging technique based on a multicore fiber photonic lantern has been designed, named computational optical imaging using a lantern (COIL). A proximal algorithm based on a sparsity prior, dubbed SARA-COIL, was subsequently proposed to solve the associated inverse problem and enable image reconstruction for high-resolution COIL microendoscopy. In this work, we develop a data-driven approach for COIL. We replace the sparsity prior in the proximal algorithm with a learned denoiser, leading to a plug-and-play (PnP) algorithm. The resulting PnP method, based on a proximal primal–dual algorithm, makes it possible to solve the Morozov formulation of the inverse problem. We use recent results in learning theory to train a network with desirable Lipschitz properties, and we show that the resulting primal–dual PnP algorithm converges to a solution of a monotone inclusion problem. Our experiments show that the proposed data-driven approach improves reconstruction quality over the variational SARA-COIL method on both simulated and real data.
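
The abstract describes a PnP primal–dual scheme in which the proximity operator of the sparsity prior is replaced by a learned denoiser, while the Morozov (constrained) data-fidelity term, ||Phi x − y||_2 ≤ eps, is handled through a dual variable. As a rough, hypothetical illustration only (not the authors' implementation or trained network), the sketch below shows a generic Condat–Vũ / Chambolle–Pock-type PnP iteration of this form; Phi, PhiT, denoiser, the noise bound eps, and the step sizes tau and sigma are all placeholder assumptions.

```python
import numpy as np

def project_l2_ball(z, y, eps):
    """Project z onto the ball {w : ||w - y||_2 <= eps} (Morozov constraint)."""
    r = z - y
    n = np.linalg.norm(r)
    return z if n <= eps else y + (eps / n) * r

def pnp_primal_dual(y, Phi, PhiT, denoiser, eps, tau, sigma, n_iter=200):
    """Generic PnP primal-dual sketch (Condat-Vu / Chambolle-Pock type).

    Heuristically seeks x such that ||Phi(x) - y||_2 <= eps, with the
    proximity operator of the prior replaced by a learned denoiser.
    Phi / PhiT are callables for the measurement operator and its adjoint.
    """
    x = np.zeros_like(PhiT(y))   # primal (image) variable
    u = np.zeros_like(y)         # dual variable on the data-fidelity term
    for _ in range(n_iter):
        # Primal step: move against the dual correction, then denoise
        x_new = denoiser(x - tau * PhiT(u))
        x_bar = 2.0 * x_new - x  # extrapolation
        # Dual step: prox of the conjugate of the l2-ball indicator,
        # via the Moreau identity prox_{s h*}(v) = v - s * P_ball(v / s)
        v = u + sigma * Phi(x_bar)
        u = v - sigma * project_l2_ball(v / sigma, y, eps)
        x = x_new
    return x
```

In sketches of this kind, convergence is typically discussed under a step-size condition such as tau * sigma * ||Phi||^2 ≤ 1 together with suitable nonexpansiveness (Lipschitz) properties of the denoiser, which is the role played by the Lipschitz-constrained network training mentioned in the abstract.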
Original language: English
Article number: pgae164
Journal: PNAS Nexus
Volume: 3
Issue number: 4
Early online date: 16 Apr 2024
DOIs
Publication status: Published - Apr 2024

Keywords

  • data-driven prior
  • multicore fiber
  • photonic lantern
  • primal–dual plug-and-play algorithm
