Ensemble learning constitutes one of the main directions in machine learning and data mining. Ensembles allow us to achieve higher accuracy, which is often not achievable with single models. One technique that has proved effective for constructing an ensemble of diverse classifiers is the use of feature subsets. Among the different approaches to ensemble feature selection, genetic search was shown to perform best in many domains. In this paper, a new strategy, GAS-SEFS (Genetic Algorithm-based Sequential Search for Ensemble Feature Selection), is introduced. Instead of one genetic process, it employs a series of processes, the goal of each of which is to build one base classifier. Experiments on 21 data sets are conducted, comparing the new strategy with a previously considered genetic strategy for different ensemble sizes and for five different ensemble integration methods. The experiments show that GAS-SEFS, although more time-consuming, often builds better ensembles, especially on data sets with larger numbers of features.
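The sequential strategy described in the abstract, running one genetic search per base classifier so that each search can reward feature subsets that differ from those already chosen, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the nearest-centroid base learner, the diversity-weighted fitness, the synthetic data, and all GA parameters are assumptions made for demonstration only.

```python
import random

random.seed(0)

def make_data(n=40, d=10):
    """Toy binary data: only features 0 and 1 carry class signal."""
    X, y = [], []
    for i in range(n):
        label = i % 2
        row = [label + random.gauss(0, 0.3) if j < 2 else random.gauss(0, 1)
               for j in range(d)]
        X.append(row)
        y.append(label)
    return X, y

def centroid_accuracy(X, y, mask):
    """Resubstitution accuracy of a nearest-centroid classifier
    restricted to the features selected by the 0/1 mask."""
    feats = [j for j, bit in enumerate(mask) if bit]
    if not feats:
        return 0.0
    cent = {}
    for label in (0, 1):
        rows = [X[i] for i in range(len(X)) if y[i] == label]
        cent[label] = [sum(r[j] for r in rows) / len(rows) for j in feats]
    correct = 0
    for i, row in enumerate(X):
        d0 = sum((row[f] - cent[0][k]) ** 2 for k, f in enumerate(feats))
        d1 = sum((row[f] - cent[1][k]) ** 2 for k, f in enumerate(feats))
        correct += (0 if d0 <= d1 else 1) == y[i]
    return correct / len(X)

def run_ga(X, y, chosen_masks, d=10, pop_size=20, gens=15, div_weight=0.3):
    """One genetic search over feature masks for one base classifier.
    Fitness = accuracy + bonus for low feature overlap with earlier
    ensemble members; the exact fitness used by GAS-SEFS differs."""
    def fitness(mask):
        acc = centroid_accuracy(X, y, mask)
        if chosen_masks:
            overlap = sum(
                sum(a & b for a, b in zip(mask, m)) / max(1, sum(mask))
                for m in chosen_masks) / len(chosen_masks)
            return acc + div_weight * (1 - overlap)
        return acc

    pop = [[random.randint(0, 1) for _ in range(d)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]          # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = random.sample(survivors, 2)
            cut = random.randrange(1, d)          # one-point crossover
            child = p1[:cut] + p2[cut:]
            child[random.randrange(d)] ^= 1       # point mutation
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

# Sequential search: each GA run contributes one base classifier's mask.
X, y = make_data()
ensemble_masks = []
for _ in range(3):
    ensemble_masks.append(run_ga(X, y, ensemble_masks))
```

Running the sketch yields three feature masks, one per base classifier; because later runs are penalized for overlapping with earlier masks, the ensemble tends toward diverse feature subsets, which is the motivation for the sequential design.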
Title of host publication: IJCAI-05 (Proceedings 19th International Joint Conference on Artificial Intelligence, Edinburgh, UK, July 30-August 5, 2005)
Editors: L.P. Kaelbling, A. Saffiotti
Publisher: Professional Book Center
Publication status: Published - 2005