Generalized thresholding and online sparsity-aware learning in a union of subspaces

Konstantinos Slavakis, Yannis Kopsinis, Sergios Theodoridis, Steve McLaughlin

Research output: Contribution to journal › Article › peer-review

15 Citations (Scopus)


This paper considers a sparse signal recovery task in time-varying (time-adaptive) environments. The contribution of the paper to sparsity-aware online learning is threefold. First, a generalized thresholding (GT) operator, which relates to both convex and non-convex penalty functions, is introduced. This operator embodies, in a unified way, the majority of well-known thresholding rules that promote sparsity. Second, a non-convexly constrained, sparsity-promoting, online learning scheme, namely the adaptive projection-based generalized thresholding (APGT), is developed; it incorporates the GT operator with a computational complexity that scales linearly with the number of unknowns. Third, the novel family of partially quasi-nonexpansive mappings is introduced as a functional analytic tool for treating the GT operator. By building upon the rich fixed point theory, this class of mappings also establishes a link between the GT operator and a union of linear subspaces: a non-convex object that lies at the heart of any sparsity-promoting technique, batch or online. Based on this functional analytic framework, a convergence analysis of the APGT is provided. Extensive experiments suggest that the APGT exhibits competitive performance when compared to computationally more demanding alternatives, such as sparsity-promoting affine projection algorithm (APA)- and recursive least-squares (RLS)-based techniques.
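To make the thresholding notions in the abstract concrete, the sketch below (not the paper's APGT, just a minimal NumPy illustration) contrasts two classical rules that a generalized thresholding operator unifies: soft thresholding, the proximal operator of the convex l1 penalty, and hard thresholding, which keeps the K largest-magnitude entries and is exactly the projection onto the union of all K-dimensional coordinate subspaces (the set of K-sparse vectors). The function names and the choice of K are illustrative assumptions, not from the paper.

```python
import numpy as np

def soft_threshold(x, lam):
    """Soft thresholding: shrink every entry of x toward zero by lam.
    This is the proximal operator of the convex l1 penalty lam * ||x||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def hard_threshold(x, K):
    """Hard thresholding: keep the K largest-magnitude entries, zero the rest.
    Equivalently, project x onto the (non-convex) union of all K-dimensional
    coordinate subspaces, i.e., the set of K-sparse vectors."""
    out = np.zeros_like(x)
    keep = np.argsort(np.abs(x))[-K:]  # indices of the K largest magnitudes
    out[keep] = x[keep]
    return out
```

For example, with x = [3.0, -1.0, 0.5], soft thresholding at lam = 1 shrinks all entries (giving [2.0, 0.0, 0.0]), while hard thresholding with K = 1 keeps only the dominant entry untouched (giving [3.0, 0.0, 0.0]); the GT operator of the paper subsumes such rules as special cases.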
Original language: English
Pages (from-to): 3760-3773
Number of pages: 14
Journal: IEEE Transactions on Signal Processing
Issue number: 15
Early online date: 21 May 2013
Publication status: Published - 1 Aug 2013


  • Adaptive signal processing
  • online learning
  • sparsity
  • signal recovery
  • thresholding
  • union of subspaces


