ELSA: Equivariant Self-Supervised Adaptation for Small Data Learning

Research output: Contribution to journal › Article › peer-review


Abstract

Learning with small datasets is challenging due to the limited volume of data, and there is currently no effective, unified framework for small data learning. Although recent progress in transfer learning and generative models has advanced machine learning in small-data scenarios, these approaches rely on prior information from large-scale datasets gathered from similar tasks or distributions, which undermines the premise of learning from small data alone. In contrast, our study aims to answer a challenging question: is it possible to learn with small data alone, where no data-driven priors are available? Inspired by the emerging field of equivariant imaging, we show that the invariance ubiquitous in data can induce equivariance of self-supervised reconstruction functions, which enables neural networks to learn with small data alone. Based on this observation, we propose a conceptually simple Equivariant Self-Supervised Adaptation (ELSA) framework for small data learning. ELSA can be used as a plug-and-play prior for various small data learning tasks. In this work, we conduct experiments on three representative tasks: small data classification, image inpainting, and style transfer. Experimental results on multiple benchmark datasets demonstrate the effectiveness and efficiency of our method for small data learning. Code will be made available at https://github.com/cdkll/ELSA.git.
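The abstract's core idea is that if the data distribution is invariant to a group of transformations (e.g. rotations), then a learned reconstruction function should commute with those transformations, and this equivariance can serve as a self-supervised training signal. As a minimal sketch of that consistency condition only (not the authors' ELSA implementation; the function names and the choice of 90-degree rotation as the transform are illustrative assumptions), an equivariance penalty can be written as the discrepancy between f(T x) and T f(x):

```python
import numpy as np

def rotate90(x, k=1):
    # The invariant transform T: rotate an image (H, W) by k * 90 degrees.
    return np.rot90(x, k)

def equivariance_loss(f, x, k=1):
    """Mean-squared deviation from the equivariance condition f(T x) == T f(x).

    f : a reconstruction function mapping an image to an image.
    x : a 2-D image array.
    The loss is zero exactly when f commutes with the rotation T on x.
    """
    lhs = f(rotate90(x, k))       # f(T x)
    rhs = rotate90(f(x), k)       # T f(x)
    return float(np.mean((lhs - rhs) ** 2))

# The identity map is trivially equivariant, so its loss is zero;
# a vertical flip does not commute with rotation, so its loss is positive.
x = np.arange(16.0).reshape(4, 4)
print(equivariance_loss(lambda z: z, x))           # 0.0
print(equivariance_loss(lambda z: z[::-1], x) > 0)  # True
```

In a training loop, a penalty of this form could be added to a task loss so that the network is regularized toward equivariance, which is the "plug-and-play prior" role the abstract describes.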
Original language: English
Article number: 11230092
Journal: IEEE Transactions on Artificial Intelligence
Early online date: 5 Nov 2025
Publication status: E-pub ahead of print - 5 Nov 2025

Keywords

  • Small data learning
  • invariance
  • equivariance
  • regularization
  • self-supervised learning
