Oblique decision trees using embedded support vector machines in classifier ensembles

V. Menkovski, I. Christou, S. Efremidis

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic

11 Citations (Scopus)
276 Downloads (Pure)

Abstract

Classifier ensembles have emerged in recent years as a promising research area for boosting the performance of pattern recognition systems. We present a new base classifier that uses support vector machines to construct oblique (non-axis-parallel) tests at the nodes of the induced decision tree. We describe a number of heuristic techniques for enhancing the tree construction process by better estimating the gain obtained by an oblique split at any tree node. We then show how embedding the new classifier in an ensemble of classifiers using the classical Hedge(beta) algorithm boosts the system's performance. Ten-fold cross-validation on UCI machine learning repository data sets shows that the new hybrid classifier outperforms, on average by more than 2.1%, both the WEKA implementation of C4.5 (J48) and the SMO implementation of SVM in WEKA. The particular ensemble algorithm is an excellent fit for online-learning applications where one seeks to improve the performance of reconfiguration-based self-healing dependable computing systems by gradually and adaptively learning what constitutes good system configurations.
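
The sketch below illustrates, under stated assumptions, the two ingredients named in the abstract: an oblique node test obtained from a linear SVM hyperplane, and the classical Hedge(beta) weight update (Freund and Schapire) for weighting base classifiers. It is not the authors' implementation; scikit-learn is assumed in place of WEKA/SMO, and the function names and the information-gain estimate are illustrative choices.

```python
import numpy as np
from sklearn.svm import LinearSVC

def entropy(y):
    """Shannon entropy of a label vector (0.0 for an empty branch)."""
    if y.size == 0:
        return 0.0
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def oblique_split(X, y):
    """Fit a linear SVM on a node's data; its hyperplane w.x + b = 0 serves as
    the oblique (non-axis-parallel) test. Returns the fitted SVM, the routing
    mask, and the information gain of the split (one simple gain estimate)."""
    svm = LinearSVC(max_iter=5000).fit(X, y)
    left = svm.decision_function(X) <= 0.0  # route samples by side of the hyperplane
    gain = entropy(y) - (left.mean() * entropy(y[left]) +
                         (~left).mean() * entropy(y[~left]))
    return svm, left, gain

def hedge_beta_update(weights, losses, beta=0.9):
    """Classical Hedge(beta) step: multiply each classifier's weight by
    beta**loss and renormalise."""
    weights = weights * np.power(beta, losses)
    return weights / weights.sum()

# Toy usage: a two-class problem whose true boundary is oblique, so no single
# axis-parallel test can separate it, while one linear-SVM split can.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
_, mask, gain = oblique_split(X, y)
print(f"information gain of oblique split: {gain:.3f}")

# Hedge(beta) over three hypothetical base classifiers with 0/1 losses.
w = np.ones(3) / 3
w = hedge_beta_update(w, losses=np.array([0.0, 1.0, 0.0]), beta=0.9)
print("updated ensemble weights:", w)
```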
Original language: English
Title of host publication: Proceedings of the 7th IEEE International Conference on Cybernetic Intelligent Systems 2008, CIS 2008, 9-10 September 2008, London, United Kingdom
Place of publication: Piscataway
Publisher: Institute of Electrical and Electronics Engineers
Pages: 11-1/6
ISBN (Print): 978-1-4244-2914-1
Publication status: Published - 2008
