Case study on bagging stable classifiers for data streams

J.N. van Rijn, G. Holmes, B. Pfahringer, J. Vanschoren

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review


Ensembles of classifiers are among the strongest classifiers in most data mining applications. Bagging ensembles exploit the instability of base classifiers by training them on different bootstrap replicates. It has been shown that bagging unstable classifiers, such as decision trees, generally yields good results, whereas bagging stable classifiers, such as k-NN, makes little difference. However, recent work suggests that this finding applies to the classical batch data mining setting rather than the data stream setting. We present an empirical study that supports this observation.
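In the stream setting, the batch bootstrap described above is commonly approximated with online bagging (Oza and Russell), where each incoming example is presented to each ensemble member k times, with k drawn from a Poisson(1) distribution. The sketch below illustrates that idea with a toy 1-NN base learner (a stable classifier, matching the paper's topic); the class names and the base learner are illustrative assumptions, not the authors' implementation.

```python
import math
import random
from collections import Counter

class NearestNeighbor:
    """Toy 1-NN base learner: a *stable* classifier (illustrative only)."""
    def __init__(self):
        self.X, self.y = [], []

    def partial_fit(self, x, label):
        self.X.append(x)
        self.y.append(label)

    def predict(self, x):
        if not self.X:
            return None
        dists = [sum((a - b) ** 2 for a, b in zip(x, p)) for p in self.X]
        return self.y[min(range(len(self.X)), key=dists.__getitem__)]

def poisson1(rng):
    """Sample k ~ Poisson(lambda=1) by inversion, using only the stdlib."""
    u, k = rng.random(), 0
    p = math.exp(-1.0)  # P(K = 0)
    c = p
    while u > c:
        k += 1
        p *= 1.0 / k    # P(K = k) = e^-1 / k!
        c += p
    return k

class OnlineBagging:
    """Online bagging: each example gets a Poisson(1) weight per member."""
    def __init__(self, n_models=10, seed=0):
        self.rng = random.Random(seed)
        self.models = [NearestNeighbor() for _ in range(n_models)]

    def partial_fit(self, x, label):
        for m in self.models:
            # Replay the example k ~ Poisson(1) times, mimicking how often
            # it would appear in one bootstrap replicate of a large batch.
            for _ in range(poisson1(self.rng)):
                m.partial_fit(x, label)

    def predict(self, x):
        votes = [m.predict(x) for m in self.models]
        votes = [v for v in votes if v is not None]
        return Counter(votes).most_common(1)[0][0] if votes else None
```

A small usage example: feeding a two-class stream and predicting by majority vote.

```python
ens = OnlineBagging(n_models=10, seed=42)
for _ in range(5):
    ens.partial_fit((0.0, 0.0), 0)
    ens.partial_fit((5.0, 5.0), 1)
print(ens.predict((0.5, 0.5)))  # class 0
```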
Original language: English
Title of host publication: BENELEARN 2015
Number of pages: 6
Publication status: Published - 2015
Event: Annual Belgian-Dutch Conference on Machine Learning (Benelearn 2015) - Delft, Netherlands
Duration: 19 Jun 2015 - 19 Jun 2015

Publication series

Name: Computing and Mathematical Sciences Papers, University of Waikato.


Conference: Annual Belgian-Dutch Conference on Machine Learning (Benelearn 2015)
Abbreviated title: Benelearn 2015


