Observations on boosting feature selection

D. B. Redpath, K. Lebart

Research output: Chapter in Book/Report/Conference proceeding › Chapter (peer-reviewed)

5 Citations (Scopus)

Abstract

This paper presents a study of the Boosting Feature Selection (BFS) algorithm [1], a method which incorporates feature selection into AdaBoost. Such an algorithm is interesting as it combines the methods studied by the boosting and ensemble feature selection research communities. Observations are made on generalisation, weighted error and error diversity to compare the algorithm's performance to AdaBoost while using a nearest mean base learner. Ensemble feature prominence is proposed as a stopping criterion for ensemble construction, and its quality is assessed using the former performance measures. BFS is found to compete with AdaBoost in terms of performance, despite the reduced feature description available to each base classifier. This is explained using weighted error and error diversity. Results show the proposed stopping criterion to be useful for trading off ensemble performance and complexity. © Springer-Verlag Berlin Heidelberg 2005.
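The core BFS idea described above (each boosting round trains the base learner on a single, greedily chosen feature, so AdaBoost's example reweighting doubles as feature selection) can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the class and function names are hypothetical, and the nearest mean learner is simplified to operate on one feature per round.

```python
import numpy as np

class SingleFeatureNearestMean:
    """Nearest-mean base learner restricted to one feature (BFS-style).
    Hypothetical helper, not from the paper."""
    def fit(self, X, y, w, feature):
        self.feature = feature
        x = X[:, feature]
        # Weighted class means along the chosen feature.
        self.mean_pos = np.average(x[y == 1], weights=w[y == 1])
        self.mean_neg = np.average(x[y == -1], weights=w[y == -1])
        return self

    def predict(self, X):
        x = X[:, self.feature]
        # Assign each point to the nearer class mean.
        return np.where(np.abs(x - self.mean_pos) < np.abs(x - self.mean_neg), 1, -1)

def bfs_adaboost(X, y, n_rounds=10):
    """AdaBoost where each round greedily picks the single feature with the
    lowest weighted error, so boosting performs feature selection as it goes."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)
    learners, alphas = [], []
    for _ in range(n_rounds):
        # Greedy search over features for the best single-feature learner.
        best_err, best = np.inf, None
        for f in range(d):
            h = SingleFeatureNearestMean().fit(X, y, w, f)
            err = np.sum(w[h.predict(X) != y])
            if err < best_err:
                best_err, best = err, h
        best_err = np.clip(best_err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - best_err) / best_err)
        # Standard AdaBoost reweighting: upweight misclassified examples.
        w *= np.exp(-alpha * y * best.predict(X))
        w /= w.sum()
        learners.append(best)
        alphas.append(alpha)
    def predict(Xq):
        return np.sign(sum(a * h.predict(Xq) for a, h in zip(alphas, learners)))
    return predict, [h.feature for h in learners]

# Toy data: the label depends only on feature 0; features 1-2 are noise,
# so BFS should repeatedly select feature 0.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = np.where(X[:, 0] > 0, 1, -1)
predict, selected = bfs_adaboost(X, y, n_rounds=5)
acc = np.mean(predict(X) == y)
```

Because the base learner sees only one feature per round, the list `selected` directly records which features the ensemble found useful, which is what makes a prominence-based stopping criterion possible.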

Original language: English
Title of host publication: Multiple Classifier Systems
Subtitle of host publication: 6th International Workshop, MCS 2005, Seaside, CA, USA, June 13-15, 2005. Proceedings
Pages: 32-41
Number of pages: 10
Volume: 3541
ISBN (Electronic): 978-3-540-31578-0
DOIs
Publication status: Published - 2005
Event: 6th International Workshop on Multiple Classifier Systems - Seaside, CA, United States
Duration: 13 Jun 2005 - 15 Jun 2005

Publication series

Name: Lecture Notes in Computer Science
Volume: 3541
ISSN (Print): 0302-9743

Conference

Conference: 6th International Workshop on Multiple Classifier Systems
Abbreviated title: MCS 2005
Country: United States
City: Seaside, CA
Period: 13/06/05 - 15/06/05

