Feature space approximation for kernel-based supervised learning

Patrick Gelß*, Stefan Klus, Ingmar Schuster, Christof Schütte

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

3 Citations (Scopus)


We propose a method for the approximation of high- or even infinite-dimensional feature vectors, which play an important role in supervised learning. The goal is to reduce the size of the training data, resulting in lower storage consumption and computational complexity. Furthermore, the method can be regarded as a regularization technique, which improves the generalizability of learned target functions. We demonstrate significant improvements over data-driven predictions computed with the full training data set. The method is applied to classification and regression problems from different application areas such as image recognition, system identification, and oceanographic time series analysis.
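The abstract does not spell out the paper's specific construction, but the general idea it describes (replacing the full training set with a small reduced set that approximately spans the kernel feature space) can be illustrated with a standard Nyström-type sketch. The landmark selection, kernel choice, and all names below are illustrative assumptions, not the authors' method:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.1):
    # Gaussian (RBF) kernel matrix between the rows of X and Y.
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def nystroem_features(X, landmarks, gamma=0.1):
    # Map X to low-dimensional approximate feature vectors built from a
    # small landmark set instead of the full training data.
    K_mm = rbf_kernel(landmarks, landmarks, gamma)
    # Pseudo-inverse square root of the landmark kernel matrix.
    U, s, Vt = np.linalg.svd(K_mm)
    W = U @ np.diag(1.0 / np.sqrt(np.maximum(s, 1e-12))) @ Vt
    return rbf_kernel(X, landmarks, gamma) @ W

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
# Hypothetical choice: 20 uniformly sampled landmarks out of 200 points.
landmarks = X[rng.choice(200, size=20, replace=False)]
Phi = nystroem_features(X, landmarks)   # 200 x 20 instead of 200 x 200
K_approx = Phi @ Phi.T                  # approximates the full kernel matrix
err = np.linalg.norm(K_approx - rbf_kernel(X, X)) / np.linalg.norm(rbf_kernel(X, X))
print(Phi.shape, err)
```

Downstream regressors or classifiers can then be trained on the explicit 20-dimensional features `Phi`, which is where the storage and complexity savings described in the abstract come from.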

Original language: English
Article number: 106935
Journal: Knowledge-Based Systems
Early online date: 20 Mar 2021
Publication status: Published - 7 Jun 2021


Keywords

  • Dimensionality reduction
  • Feature spaces
  • Kernel-based methods
  • Supervised learning
  • System identification

ASJC Scopus subject areas

  • Software
  • Management Information Systems
  • Information Systems and Management
  • Artificial Intelligence
