A Binary PSO-Based Model Selection for Novel Smooth Twin Support Vector Regression

Huajuan Huang, Xiuxi Wei, Yongquan Zhou
Copyright: © 2022 | Pages: 19
DOI: 10.4018/IJSIR.302615

Abstract

The recently proposed smooth twin support vector regression (STSVR) achieves faster training than twin support vector regression (TSVR). In STSVR, the sigmoid function serves as the smoothing function; however, its approximation precision is relatively low, so the generalization performance of STSVR is not good enough. Moreover, STSVR has at least three parameters that need tuning, which limits its practical applications. In this paper, we improve the regression performance of STSVR in two respects. First, by introducing the Chen-Harker-Kanzow-Smale (CHKS) function, a new smooth version of TSVR, termed smooth CHKS twin support vector regression (SCTSVR), is proposed. Second, a binary particle swarm optimization (PSO)-based model selection for SCTSVR is suggested. Computational results on one synthetic and several benchmark datasets confirm the substantial improvements in the training process of the proposed algorithm.
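The paper's first contribution replaces the sigmoid smoothing function in STSVR with the CHKS function. As a minimal sketch of why this matters (assuming the standard forms of both functions from the smoothing literature; the parameter values alpha and mu below are illustrative choices, not values from the paper), one can compare how well each approximates the non-smooth plus function x_+ = max(x, 0) that the smooth formulations replace:

```python
import numpy as np

def plus(x):
    # Plus function x_+ = max(x, 0): the non-smooth term that smoothing replaces.
    return np.maximum(x, 0.0)

def sigmoid_smooth(x, alpha=5.0):
    # Sigmoid-based smooth approximation used in smooth SVM/TSVR variants:
    # p(x, alpha) = x + (1/alpha) * log(1 + exp(-alpha * x)).
    return x + np.log1p(np.exp(-alpha * x)) / alpha

def chks(x, mu=0.1):
    # Chen-Harker-Kanzow-Smale (CHKS) smoothing function:
    # phi(x, mu) = (x + sqrt(x^2 + 4 * mu^2)) / 2.
    return (x + np.sqrt(x ** 2 + 4.0 * mu ** 2)) / 2.0

x = np.linspace(-2.0, 2.0, 401)
print("sigmoid max error:", np.max(np.abs(sigmoid_smooth(x) - plus(x))))
print("CHKS max error:   ", np.max(np.abs(chks(x) - plus(x))))
```

Both approximations peak in error at x = 0 (log(2)/alpha for the sigmoid form, mu for CHKS), so the achievable precision is governed by the smoothing parameter; the paper's argument is that the CHKS function yields the tighter, better-behaved approximation within SCTSVR.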

1. Introduction

Support vector machine (SVM), presented by Vapnik and co-workers (Ding, 2012), is a computationally powerful kernel-based tool for binary classification and regression. Because it is built on the structural risk minimization principle, SVM successfully overcomes the high-dimensionality and local-minimum problems. Therefore, compared with other machine learning methods, such as artificial neural networks (Adriana, 2020; Jing, 2013), SVM offers better generalization ability. Within a few years of its introduction, SVM has shown excellent performance in many real-world predictive data mining applications, such as text categorization (Liu, 2020), time series prediction (Chen, 2012), pattern recognition (Tang, 2020), and image processing (Lo, 2012).

However, the computational complexity of SVM in the training stage is high, i.e., O(m^3), where m is the total number of training samples. To overcome this problem, many algorithms for reducing the computational complexity of SVM have been presented, such as the chunking algorithm (Cortes, 1995), the decomposition algorithm (Osuna, 1997), and sequential minimal optimization (SMO) (Platt, 1999). On the other hand, many researchers have proposed deformation algorithms based on the standard SVM. For example, in 2006, Mangasarian et al. (Mangasarian, 2006) proposed a nonparallel-plane classifier for binary classification, named the generalized eigenvalue proximal support vector machine (GEPSVM). The essence of GEPSVM is to seek two nonparallel planes such that the data points of each class are proximal to one of them. GEPSVM trains quickly because it solves two generalized eigenvalue problems of the order of the input space dimension, but its classification accuracy is low. In 2007, Jayadeva et al. (Jayadeva, 2007) proposed a new machine learning method called the twin support vector machine (TWSVM) for binary classification in the spirit of GEPSVM. TWSVM generates two nonparallel planes such that each plane is closer to one of the two classes and as far as possible from the other. TWSVM solves a pair of smaller quadratic programming problems (QPPs) instead of the single large one in SVM, which makes it approximately four times faster than the traditional SVM, as sketched below. At present, TWSVM has become one of the popular methods because of its low computational complexity, and many variants have been proposed by Wang (2013), Shao (2013), and Peng (2013). TWSVM, however, is designed for classification problems.
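The four-fold figure follows from a rough operation count (a back-of-envelope sketch, assuming the QPP solver's cost grows cubically in the number of training samples, as for standard SVM training above): replacing one QPP over all m samples with two QPPs of roughly m/2 samples each gives

\[
2 \cdot O\!\left(\left(\tfrac{m}{2}\right)^{3}\right) \;=\; 2 \cdot \tfrac{1}{8}\, O(m^{3}) \;=\; \tfrac{1}{4}\, O(m^{3}),
\]

i.e., about a quarter of the original work, matching the speedup reported for TWSVM.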
