Comments on "Pruning error minimization in least squares support vector machines"

Anthony Kuh, Philippe De Wilde

Research output: Contribution to journal › Comment/debate

Abstract

In this letter, we comment on "Pruning Error Minimization in Least Squares Support Vector Machines" by B. J. de Kruif and T. J. A. de Vries. The original paper proposes a way of pruning training examples for least squares support vector machines (LS-SVM) using no regularization (γ = ∞). This causes a problem as the derivation involves inverting a matrix that is often singular. We discuss a modification of this algorithm that prunes with regularization (γ finite and nonzero) and is also computationally more efficient. © 2007 IEEE.
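For context, the sketch below illustrates why finite, nonzero regularization matters in the standard LS-SVM dual formulation (Suykens-style); it is not the pruning algorithm analyzed in the letter, and the function name lssvm_fit, the kernel choice, and the toy data are assumptions made for this illustration only.

# Illustration only: the standard LS-SVM dual system with regularization.
# With finite gamma the lower-right block is K + I/gamma, which is strictly
# positive definite for a positive semidefinite kernel, so the system is
# solvable. With gamma = infinity (no regularization) the block reduces to
# the kernel matrix K alone, which is often singular.
import numpy as np

def lssvm_fit(X, y, gamma, kernel):
    """Solve the bordered LS-SVM dual system
        [ 0   1^T         ] [ b     ]   [ 0 ]
        [ 1   K + I/gamma ] [ alpha ] = [ y ].
    """
    n = len(y)
    K = np.array([[kernel(xi, xj) for xj in X] for xi in X])
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]          # bias b, dual coefficients alpha

# Toy example: with a linear kernel and two coinciding inputs, K is singular,
# but the regularized system (finite gamma) remains well conditioned.
X = np.array([[1.0], [1.0], [2.0]])
y = np.array([1.0, 1.0, -1.0])
linear = lambda a, b: float(a @ b)
b, alpha = lssvm_fit(X, y, gamma=10.0, kernel=linear)
print(b, alpha)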

Original language: English
Pages (from-to): 606-609
Number of pages: 4
Journal: IEEE Transactions on Neural Networks
Volume: 18
Issue number: 2
DOIs: 10.1109/TNN.2007.891590
Publication status: Published - Mar 2007

Fingerprint

Least-Squares Analysis
Support Vector Machine

Keywords

  • Least squares kernel methods
  • Online updating
  • Pruning
  • Regularization

Cite this

Kuh, Anthony ; De Wilde, Philippe. / Comments on "Pruning error minimization in least squares support vector machines". In: IEEE Transactions on Neural Networks. 2007 ; Vol. 18, No. 2. pp. 606-609.
@article{bc8f9ce3ff894b639c99841842d5774b,
title = "Comments on {"}Pruning error minimization in least squares support vector machines{"}",
abstract = "In this letter, we comment on {"}Pruning Error Minimization in Least Squares Support Vector Machines{"} by B. J. de Kruif and T. J. A. de Vries. The original paper proposes a way of pruning training examples for least squares support vector machines (LS-SVM) using no regularization ($\gamma = \infty$). This causes a problem as the derivation involves inverting a matrix that is often singular. We discuss a modification of this algorithm that prunes with regularization ($\gamma$ finite and nonzero) and is also computationally more efficient. {\circledC} 2007 IEEE.",
keywords = "Least squares kernel methods, Online updating, Pruning, Regularization",
author = "Anthony Kuh and {De Wilde}, Philippe",
year = "2007",
month = "3",
doi = "10.1109/TNN.2007.891590",
language = "English",
volume = "18",
pages = "606--609",
journal = "IEEE Transactions on Neural Networks",
issn = "1045-9227",
publisher = "IEEE",
number = "2",

}

Comments on "Pruning error minimization in least squares support vector machines". / Kuh, Anthony; De Wilde, Philippe.

In: IEEE Transactions on Neural Networks, Vol. 18, No. 2, 03.2007, p. 606-609.

Research output: Contribution to journal › Comment/debate

TY - JOUR

T1 - Comments on "Pruning error minimization in least squares support vector machines"

AU - Kuh, Anthony

AU - De Wilde, Philippe

PY - 2007/3

Y1 - 2007/3

N2 - In this letter, we comment on "Pruning Error Minimization in Least Squares Support Vector Machines" by B. J. de Kruif and T. J. A. de Vries. The original paper proposes a way of pruning training examples for least squares support vector machines (LS-SVM) using no regularization (γ = ∞). This causes a problem as the derivation involves inverting a matrix that is often singular. We discuss a modification of this algorithm that prunes with regularization (γ finite and nonzero) and is also computationally more efficient. © 2007 IEEE.

AB - In this letter, we comment on "Pruning Error Minimization in Least Squares Support Vector Machines" by B. J. de Kruif and T. J. A. de Vries. The original paper proposes a way of pruning training examples for least squares support vector machines (LS-SVM) using no regularization (γ = ∞). This causes a problem as the derivation involves inverting a matrix that is often singular. We discuss a modification of this algorithm that prunes with regularization (γ finite and nonzero) and is also computationally more efficient. © 2007 IEEE.

KW - Least squares kernel methods

KW - Online updating

KW - Pruning

KW - Regularization

UR - http://www.scopus.com/inward/record.url?scp=34047118751&partnerID=8YFLogxK

U2 - 10.1109/TNN.2007.891590

DO - 10.1109/TNN.2007.891590

M3 - Comment/debate

C2 - 17385646

VL - 18

SP - 606

EP - 609

JO - IEEE Transactions on Neural Networks

JF - IEEE Transactions on Neural Networks

SN - 1045-9227

IS - 2

ER -