Case study on bagging stable classifiers for data streams

J.N. van Rijn, G. Holmes, B. Pfahringer, J. Vanschoren

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer review

Abstract

Ensembles of classifiers are among the strongest classifiers in most data mining applications. Bagging ensembles exploit the instability of base classifiers by training them on different bootstrap replicates. It has been shown that bagging unstable classifiers, such as decision trees, generally yields good results, whereas bagging stable classifiers, such as k-NN, makes little difference. However, recent work suggests that this finding applies to the classical batch data mining setting rather than the data stream setting. We present an empirical study that supports this observation.
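
The bagging-for-streams setup the abstract refers to can be illustrated with a short sketch. The snippet below is an illustration, not code from the paper: it follows the online bagging scheme of Oza and Russell, in which each ensemble member is shown every incoming instance k ~ Poisson(1) times, approximating bootstrap replicates over an unbounded stream. The base learner is a deliberately simple incremental 1-NN, i.e. a stable classifier of the kind the study considers; all class and function names are illustrative.

# Minimal sketch of online bagging over a data stream (illustrative, assumes
# the Oza & Russell Poisson(1) approximation of bootstrap resampling).
import math
import random
from collections import Counter


class OneNN:
    """Tiny incremental 1-nearest-neighbour classifier (a stable base learner)."""

    def __init__(self):
        self.memory = []  # (features, label) pairs seen so far

    def learn(self, x, y):
        self.memory.append((x, y))

    def predict(self, x):
        if not self.memory:
            return None
        # Label of the closest stored instance under squared Euclidean distance.
        nearest = min(self.memory,
                      key=lambda m: sum((a - b) ** 2 for a, b in zip(m[0], x)))
        return nearest[1]


class OnlineBagging:
    """Online bagging: present each instance Poisson(1)-many times to each member."""

    def __init__(self, base_learner, ensemble_size=10, seed=1):
        self.members = [base_learner() for _ in range(ensemble_size)]
        self.rng = random.Random(seed)

    def _poisson1(self):
        # Knuth's algorithm for sampling Poisson(lambda = 1).
        threshold, k, p = math.exp(-1.0), 0, 1.0
        while True:
            k += 1
            p *= self.rng.random()
            if p <= threshold:
                return k - 1

    def learn(self, x, y):
        for member in self.members:
            for _ in range(self._poisson1()):
                member.learn(x, y)

    def predict(self, x):
        votes = [m.predict(x) for m in self.members]
        votes = [v for v in votes if v is not None]
        return Counter(votes).most_common(1)[0][0] if votes else None


if __name__ == "__main__":
    # Toy prequential loop: predict first, then train on each arriving instance.
    stream = [((0.0, 0.0), "a"), ((1.0, 1.0), "b"), ((0.1, 0.2), "a"),
              ((0.9, 1.1), "b"), ((0.2, 0.1), "a"), ((1.1, 0.9), "b")]
    ensemble = OnlineBagging(OneNN, ensemble_size=5)
    correct = sum(int(ensemble.predict(x) == y) or ensemble.learn(x, y) or 0
                  for x, y in stream)
    print("prequential accuracy:", correct / len(stream))

In a classical batch setting the Poisson weighting would be replaced by drawing a bootstrap replicate of fixed size with replacement; the stream variant simulates the same resampling one instance at a time, which is what allows the batch finding about stable versus unstable base learners to be re-examined in the stream setting.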
Original language: English
Title: BENELEARN 2015
Number of pages: 6
Status: Published - 2015
Event: Annual Belgian-Dutch Conference on Machine Learning (Benelearn 2015) - Delft, Netherlands
Duration: 19 Jun 2015 - 19 Jun 2015

Publication series

Name: Computing and Mathematical Sciences Papers, University of Waikato

Conference

Conference: Annual Belgian-Dutch Conference on Machine Learning (Benelearn 2015)
Abbreviated title: Benelearn 2015
Country/Region: Netherlands
City: Delft
Period: 19/06/15 - 19/06/15
