Lipschitz-regularized gradient flows and generative particle algorithms for high-dimensional scarce data

Hyemin Gu, Panagiota Birmpa, Yannis Pantazis, Luc Rey-Bellet, Markos A. Katsoulakis

Research output: Contribution to journal › Article › peer-review


Abstract

We build a new class of generative algorithms capable of efficiently learning an arbitrary target distribution from possibly scarce, high-dimensional data and of subsequently generating new samples. These algorithms are particle-based and are constructed as gradient flows of Lipschitz-regularized Kullback-Leibler or other $f$-divergences, along which samples from a source distribution are stably transported, as particles, toward the vicinity of the target distribution. As a highlighted result in data integration, we demonstrate that the proposed algorithms correctly transport gene expression data points with dimension exceeding 54K, even though the sample size is typically only in the hundreds.
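As a rough illustration of the particle scheme described in the abstract, the sketch below alternates between fitting a variational function (discriminator) for a Lipschitz-regularized KL objective and moving the particles one explicit Euler step along its negative gradient. This is a minimal PyTorch sketch under stated assumptions, not the authors' implementation: the network architecture, the one-sided gradient penalty used to approximate the Lipschitz constraint, and all names and hyperparameters (Discriminator, lipschitz_penalty, kl_particle_flow, step_size, penalty_weight, etc.) are illustrative choices.

```python
# Hypothetical sketch of a Lipschitz-regularized KL gradient-flow particle
# algorithm; names and hyperparameters are illustrative, not the paper's.
import torch
import torch.nn as nn


class Discriminator(nn.Module):
    """Small MLP playing the role of the variational function phi."""

    def __init__(self, dim, width=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, width), nn.ReLU(),
            nn.Linear(width, width), nn.ReLU(),
            nn.Linear(width, 1),
        )

    def forward(self, x):
        return self.net(x).squeeze(-1)


def lipschitz_penalty(phi, x, lip_const=1.0):
    """One-sided penalty pushing ||grad_x phi|| below the Lipschitz bound."""
    x = x.detach().clone().requires_grad_(True)
    (grad,) = torch.autograd.grad(phi(x).sum(), x, create_graph=True)
    return ((grad.norm(dim=1) - lip_const).clamp(min=0.0) ** 2).mean()


def kl_particle_flow(target, particles, n_outer=200, n_critic=10,
                     step_size=1e-2, penalty_weight=10.0):
    """Transport `particles` (source samples) toward `target` samples by
    alternating (i) ascent on a variational KL objective with an
    approximate Lipschitz constraint on phi and (ii) an explicit Euler
    step of the particles along -grad phi."""
    phi = Discriminator(target.shape[1])
    opt = torch.optim.Adam(phi.parameters(), lr=1e-3)
    particles = particles.clone()
    for _ in range(n_outer):
        # (i) fit phi:  max_phi  E_particles[phi] - E_target[exp(phi - 1)]
        for _ in range(n_critic):
            opt.zero_grad()
            obj = phi(particles).mean() - torch.exp(phi(target) - 1.0).mean()
            loss = -obj + penalty_weight * lipschitz_penalty(phi, particles)
            loss.backward()
            opt.step()
        # (ii) move each particle along the descent direction -grad phi
        x = particles.clone().requires_grad_(True)
        (grad,) = torch.autograd.grad(phi(x).sum(), x)
        particles = (particles - step_size * grad).detach()
    return particles


# Toy usage: transport a Gaussian blob toward a shifted Gaussian target.
if __name__ == "__main__":
    torch.manual_seed(0)
    target = torch.randn(300, 5) + 3.0
    source = torch.randn(300, 5)
    moved = kl_particle_flow(target, source)
    print(moved.mean(dim=0))  # particle mean should drift toward the target mean
```

The Lipschitz bound on phi caps the particle speed at every step, which is one way to read the "stable transport" claim in the abstract; the gradient penalty above is only one common approximation of that constraint (spectral normalization is another).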
Original language: English
Journal: SIAM Journal on Mathematics of Data Science
Publication status: Accepted/In press - 11 Jun 2024

Keywords

  • stat.ML
  • cs.LG
  • 35Q84, 49Q22, 62B10, 65C35, 68T07, 94A17
