Fuzzy Rough Support Vector Machine for Data Classification

Arindam Chaudhuri
Copyright: © 2016 | Pages: 28
DOI: 10.4018/IJFSA.2016040103

Abstract

In this paper, the classification task is performed by a fuzzy rough support vector machine (FRSVM), a variant of the fuzzy SVM (FSVM) and the modified fuzzy SVM (MFSVM). The fuzzy rough set reduces sensitivity to noisy samples and handles impreciseness. The membership function is developed as a function of the center and radius of each class in feature space and plays an important role in shaping the decision surface. The training samples may be either linearly or nonlinearly separable; for nonlinear training samples, the input space is mapped into a high-dimensional feature space in which the separating surface is computed. Different input points make unique contributions to the decision surface. The performance of the classifier is assessed in terms of the number of support vectors. The effect of variability in prediction and the generalization of FRSVM are examined with respect to values of the regularization parameter C. FRSVM effectively resolves imbalanced and overlapping class problems, generalizes to unseen data, and relaxes the dependency between features and labels. Experimental results on both synthetic and real datasets show that FRSVM achieves superior performance in reducing the effect of outliers compared with existing SVMs.
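The preview does not give the exact fuzzy rough membership formula, but a minimal sketch of the general idea described above — a membership that is largest at the class center and decays toward the class radius, then used to weight each sample's contribution to a soft-margin SVM — might look as follows. The linear decay and the function names are illustrative assumptions, not the paper's definition; scikit-learn's sample_weight simply scales C per sample, which approximates an FSVM-style objective.

    import numpy as np
    from sklearn.svm import SVC

    def fuzzy_memberships(X, y, delta=1e-6):
        # Illustrative membership: ~1 at the class center, decaying
        # linearly toward 0 at the class radius (the maximum distance
        # from the center). The paper's fuzzy rough membership is more
        # elaborate; only the general shape is reproduced here.
        m = np.empty(len(y), dtype=float)
        for label in np.unique(y):
            idx = np.where(y == label)[0]
            center = X[idx].mean(axis=0)                  # class center
            dists = np.linalg.norm(X[idx] - center, axis=1)
            radius = dists.max() + delta                  # class radius
            m[idx] = 1.0 - dists / radius
        return m

    # Two Gaussian blobs; the weighted SVM downweights points far from
    # their class center, reducing the influence of outliers.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0.0, 1.0, (50, 2)),
                   rng.normal(3.0, 1.0, (50, 2))])
    y = np.array([0] * 50 + [1] * 50)

    clf = SVC(kernel="rbf", C=10.0)
    clf.fit(X, y, sample_weight=fuzzy_memberships(X, y))
    print("support vectors per class:", clf.n_support_)

Samples near the class boundary or corrupted by noise receive small memberships and therefore contribute correspondingly less to the decision surface, which is the mechanism the abstract describes.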
Article Preview

1. Introduction

Classification of data (Aggarwal, 2014), (Duda, Hart & Stork, 2007) is a common task in machine learning. In this direction, the support vector machine (SVM) (Burges, 1998) has emerged as a promising pattern classification tool in recent years. It is based on the principle of structural risk minimization (SRM) from statistical learning theory (Vapnik, 1998). SVM was proposed by Vapnik (Vapnik, 1998) and has received much attention from the pattern recognition community (Abe, 2010), (Bishop, 2006), (Duda, Hart & Stork, 2007). It has been widely used in various real-life applications with appreciable classification performance (Burges, 1998); the soft-margin formulation underlying it is recalled below.
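For reference, the standard soft-margin primal that SRM motivates, together with the fuzzy-weighted variant on which FSVM-style methods such as FRSVM build, can be written as follows. This is a textbook formulation, not quoted from this paper; the memberships s_i are the quantities FRSVM derives from fuzzy rough sets.

    % Soft-margin SVM primal: C trades margin width against total slack.
    \min_{w,\,b,\,\xi}\ \frac{1}{2}\lVert w\rVert^{2} + C\sum_{i=1}^{n}\xi_{i}
    \quad \text{s.t.}\ \ y_{i}\bigl(w^{\top}\phi(x_{i}) + b\bigr) \ge 1 - \xi_{i},\ \ \xi_{i} \ge 0.

    % Fuzzy-weighted variant: each slack is scaled by a membership
    % s_i \in (0, 1], so noisy or outlying samples are penalized less.
    \min_{w,\,b,\,\xi}\ \frac{1}{2}\lVert w\rVert^{2} + C\sum_{i=1}^{n} s_{i}\,\xi_{i}
    \quad \text{s.t.}\ \ y_{i}\bigl(w^{\top}\phi(x_{i}) + b\bigr) \ge 1 - \xi_{i},\ \ \xi_{i} \ge 0.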

Many complex problems have been solved by SVMs (Abe, 2010). Some notable applications where SVM has been successfully applied are handwritten digit recognition, object recognition, speaker identification, charmed quark detection, face detection, optical character recognition, medical diagnostics, text classification, etc. (Abe, 2010). Two important applications where SVM has outperformed other methods are electric load prediction (EUNITE, 2001) and optical character recognition (Tautu & Leon, 2012). For regression estimation, SVMs have been compared on benchmark time series prediction tests, the Boston housing problem, and (on artificial data) the PET operator inversion problem (Abe, 2010), (Burges, 1998). In most of these cases, SVM generalization performance, i.e., error rates on test sets, either matches or is significantly better than that of the competing methods. The use of SVMs for density estimation and ANOVA decomposition has also been studied (Burges, 1998). Regarding extensions, the basic SVM contains no prior knowledge of the problem. For example, a large class of SVMs for the image recognition problem gives the same results if the pixels are first permuted randomly (with each image suffering the same permutation), an act of vandalism that would leave the best-performing neural networks severely handicapped. Although SVMs have good generalization performance, they can be abysmally slow in the test phase, a problem which has been addressed in (Burges, 1998). Several works have generalized the basic ideas of SVM, shown connections to regularization theory (Abe, 2010), (Burges, 1998), and shown how SVM ideas can be incorporated in a wide range of other algorithms (Abe, 2010), (Burges, 1998), (Chaudhuri, De & Chatterjee, 2008), (Chaudhuri & De, 2011), (Chaudhuri, 2014).
