Scalable splitting algorithms for big-data interferometric imaging in the SKA era

Alex Onose, Rafael E. Carrillo, Audrey Repetti, Jason D. McEwen, Jean-Philippe Thiran, Jean-Christophe Pesquet, Yves Wiaux

Research output: Contribution to journal › Article


Abstract

In the context of next-generation radio telescopes, like the Square Kilometre Array, the efficient processing of large-scale datasets is extremely important. Convex optimisation approaches developed within the compressive sensing framework have recently emerged, providing both enhanced image reconstruction quality and scalability to increasingly large datasets. We focus herein mainly on scalability and propose two new convex optimisation algorithmic structures able to solve the convex optimisation tasks arising in radio-interferometric imaging. They rely on proximal splitting and forward-backward iterations and can be seen, by analogy with the CLEAN major-minor cycle, as running sophisticated CLEAN-like iterations in parallel in multiple data, prior, and image spaces. Both methods support any convex regularisation function, in particular the well-studied ℓ1 priors promoting image sparsity in an adequate domain. Tailored for big data, they employ parallel and distributed computations to achieve scalability in terms of both memory and computational requirements. One of them also exploits randomisation, over data blocks at each iteration, offering further flexibility. We present simulation results showing the feasibility of the proposed methods as well as their advantages compared to state-of-the-art algorithmic solvers. Our Matlab code is available online on GitHub.
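To illustrate the forward-backward iterations underpinning the methods described above, the following is a minimal sketch (not the authors' Matlab implementation) of forward-backward splitting for an ℓ1-regularised least-squares problem, min_x 0.5‖y − Φx‖² + λ‖x‖₁. The measurement operator Φ, the toy sparse signal, and all parameter values are illustrative assumptions; a gradient (forward) step on the data-fidelity term alternates with a proximal (backward) step on the ℓ1 prior, realised as elementwise soft-thresholding.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximity operator of t*||.||_1: elementwise soft-thresholding.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def forward_backward(y, Phi, lam, gamma, n_iter=500):
    """Minimise 0.5*||y - Phi x||^2 + lam*||x||_1 by forward-backward splitting.

    gamma is the step size; convergence requires gamma < 2 / ||Phi||_2^2.
    """
    x = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        grad = Phi.T @ (Phi @ x - y)                       # forward (gradient) step
        x = soft_threshold(x - gamma * grad, gamma * lam)  # backward (proximal) step
    return x

# Toy example (hypothetical): recover a 3-sparse signal from
# m = 50 noisy random measurements of an n = 100 dimensional vector.
rng = np.random.default_rng(0)
n, m = 100, 50
x_true = np.zeros(n)
x_true[[5, 30, 70]] = [1.0, -2.0, 1.5]
Phi = rng.standard_normal((m, n)) / np.sqrt(m)
y = Phi @ x_true + 0.01 * rng.standard_normal(m)

gamma = 1.0 / np.linalg.norm(Phi, 2) ** 2   # safe step size
x_hat = forward_backward(y, Phi, lam=0.02, gamma=gamma)
```

The paper's algorithms extend this basic scheme by splitting the data term over blocks handled in parallel (optionally visiting a random subset of blocks per iteration), but the per-block update retains this forward-backward structure.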
Original language: English
Pages (from-to): 4314-4335
Number of pages: 22
Journal: Monthly Notices of the Royal Astronomical Society
Volume: 462
Issue number: 4
Early online date: 1 Aug 2016
Publication status: Published - 11 Nov 2016

