Comments on "Pruning error minimization in least squares support vector machines"

Anthony Kuh, Philippe De Wilde

Research output: Contribution to journal › Comment/debate › peer-review

57 Citations (Scopus)

Abstract

In this letter, we comment on "Pruning Error Minimization in Least Squares Support Vector Machines" by B. J. de Kruif and T. J. A. de Vries. The original paper proposes a way of pruning training examples for least squares support vector machines (LS-SVM) using no regularization (γ = ∞). This causes a problem, as the derivation involves inverting a matrix that is often singular. We discuss a modification of this algorithm that prunes with regularization (γ finite and nonzero) and is also computationally more efficient. © 2007 IEEE.
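For context, a brief sketch of the standard LS-SVM dual linear system (in the usual Suykens-style notation; the symbols Ω, γ, α, b, and y below are our own illustrative assumptions, not taken from the commented paper) shows why the unregularized case can be singular:

```latex
% Sketch of the standard LS-SVM dual system (notation assumed, not from the paper):
% \Omega is the kernel (Gram) matrix, \gamma the regularization parameter,
% \alpha the support values, b the bias, y the targets, \mathbf{1} a vector of ones.
\begin{bmatrix}
  0          & \mathbf{1}^{\top} \\
  \mathbf{1} & \Omega + \gamma^{-1} I
\end{bmatrix}
\begin{bmatrix} b \\ \alpha \end{bmatrix}
=
\begin{bmatrix} 0 \\ y \end{bmatrix}
% With \gamma = \infty the ridge term \gamma^{-1} I vanishes, so \Omega alone
% must be inverted; it is often singular (e.g., when training examples are
% duplicated or nearly linearly dependent). A finite, nonzero \gamma keeps the
% system well conditioned.
```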

Original language: English
Pages (from-to): 606-609
Number of pages: 4
Journal: IEEE Transactions on Neural Networks
Volume: 18
Issue number: 2
DOIs
Publication status: Published - Mar 2007

Keywords

  • Least squares kernel methods
  • Online updating
  • Pruning
  • Regularization
