Classifier ensembles have emerged in recent years as a promising research area for boosting the performance of pattern recognition systems. We present a new base classifier that uses support vector machines to construct oblique (non-axis-parallel) tests at the nodes of the induced decision tree. We describe a number of heuristic techniques that enhance the tree construction process by better estimating the gain obtained from an oblique split at any tree node. We then show how embedding the new classifier in an ensemble of classifiers using the classical Hedge(beta) algorithm boosts the performance of the system. In 10-fold cross-validation tests on UCI Machine Learning Repository data sets, the new hybrid classifier outperforms both the WEKA implementation of C4.5 (J48) and the SMO implementation of SVM in WEKA by more than 2.1% on average. The particular ensemble algorithm is an excellent fit for online-learning applications in which one seeks to improve the performance of self-healing dependable computing systems based on reconfiguration, by gradually and adaptively learning what constitutes good system configurations.
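The two core mechanisms the abstract mentions can be illustrated compactly. A minimal sketch, assuming an illustrative beta value and toy losses (none of which are taken from the paper): an oblique node routes a sample by the sign of a linear decision function, such as one learned by a linear SVM, and the Hedge(beta) algorithm maintains one weight per base classifier, multiplying each weight by beta raised to that classifier's loss in the current round and renormalizing.

```python
def oblique_test(x, w, b):
    """Oblique node test: route a sample left or right by the sign of a
    linear decision function w.x + b (e.g. learned by a linear SVM).
    Returns True for the 'right' branch, False for the 'left' branch."""
    return sum(wi * xi for wi, xi in zip(w, x)) + b > 0

def hedge_update(weights, losses, beta):
    """Hedge(beta) weight update: scale each expert's weight by
    beta ** loss, then renormalize so the weights sum to one."""
    updated = [w * (beta ** l) for w, l in zip(weights, losses)]
    total = sum(updated)
    return [w / total for w in updated]

# Toy example (illustrative values, not from the paper):
# a 2-D oblique test with hyperplane x0 + x1 - 1 = 0.
branch = oblique_test([0.8, 0.5], w=[1.0, 1.0], b=-1.0)  # 0.3 > 0 -> True

# Three base classifiers start with equal weight; in this round
# only the second one makes an error (0/1 loss), so its weight shrinks.
weights = hedge_update([1/3, 1/3, 1/3], losses=[0, 1, 0], beta=0.5)
print(branch, weights)
```

With beta = 0.5 the erring classifier's weight halves before renormalization, so subsequent weighted votes lean toward the classifiers that have been accurate so far; smaller beta values penalize mistakes more aggressively.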
Title of host publication: Proceedings of the 7th IEEE International Conference on Cybernetic Intelligent Systems 2008, CIS 2008, 9-10 September 2008, London, United Kingdom
Place of publication: Piscataway
Publisher: Institute of Electrical and Electronics Engineers
Publication status: Published - 2008
Menkovski, V., Christou, I., & Efremidis, S. (2008). Oblique decision trees using embedded support vector machines in classifier ensembles. In Proceedings of the 7th IEEE International Conference on Cybernetic Intelligent Systems 2008, CIS 2008, 9-10 September 2008, London, United Kingdom (pp. 11-1/6). Institute of Electrical and Electronics Engineers. https://doi.org/10.1109/UKRICIS.2008.4798937