Efficient Decision Trees for Multi-class Support Vector Machines Using Entropy and Generalization Error Estimation

08/28/2017
by   Pittipol Kantavat, et al.

We propose new methods for Support Vector Machines (SVMs) using a tree architecture for multi-class classification. In each node of the tree, we select an appropriate binary classifier using entropy and generalization error estimation, then group the examples into positive and negative classes based on the selected classifier and train a new classifier for use in the classification phase. The proposed methods can work in time complexity between O(log2 N) and O(N), where N is the number of classes. We compared the performance of our proposed methods to the traditional techniques on the UCI machine learning repository using 10-fold cross-validation. The experimental results show that our proposed methods are very useful for problems that need fast classification time or problems with a large number of classes, as the proposed methods run much faster than the traditional techniques but still provide comparable accuracy.
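The classification phase described above can be sketched as a binary tree whose internal nodes each hold one binary classifier: an example follows the positive or negative branch at every node until it reaches a leaf labeled with a single class, so a balanced tree needs only O(log2 N) decisions per example. The sketch below is illustrative, not the authors' implementation: the trained SVMs are replaced by simple threshold functions so the example is self-contained, and the `Node` structure and `classify` function are hypothetical names.

```python
# Illustrative sketch (not the authors' code): a binary tree of classifiers
# for multi-class prediction. Real trained SVMs are replaced by threshold
# stubs so the example runs standalone.

class Node:
    def __init__(self, classifier=None, pos=None, neg=None, label=None):
        self.classifier = classifier  # binary decision function at an internal node
        self.pos = pos                # subtree covering the positive class group
        self.neg = neg                # subtree covering the negative class group
        self.label = label            # class label (leaves only)

def classify(node, x):
    """Traverse the tree: one binary decision per level, i.e. O(log2 N)
    decisions per example for a balanced tree over N classes."""
    while node.label is None:
        node = node.pos if node.classifier(x) > 0 else node.neg
    return node.label

# Four classes on the real line: a balanced tree needs only two decisions.
tree = Node(
    classifier=lambda x: x - 2.0,           # splits classes {2, 3} vs {0, 1}
    pos=Node(classifier=lambda x: x - 3.0,  # splits {3} vs {2}
             pos=Node(label=3), neg=Node(label=2)),
    neg=Node(classifier=lambda x: x - 1.0,  # splits {1} vs {0}
             pos=Node(label=1), neg=Node(label=0)),
)

print([classify(tree, v) for v in [0.5, 1.5, 2.5, 3.5]])  # [0, 1, 2, 3]
```

In the paper's setting, each node's classifier is chosen using entropy and generalization error estimation, and the degenerate case of a completely unbalanced tree gives the O(N) upper bound on classification time.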


