Intelligent Approach for Enhancing Prediction Issues in Scalable Data Mining

Khaled M. Fouad, Tarek Elsheshtawy, Mohamed F. Dawood
Copyright: © 2021 | Pages: 34
DOI: 10.4018/IJSKD.2021040108

Abstract

Support vector regression (SVR) is a supervised machine learning algorithm that can be exploited for prediction problems. The main enhancement issue of SVR is selecting reliable parameters that assure high SVR performance. In this paper, an intelligent approach is proposed that integrates an enhanced particle swarm optimization (PSO) with SVR to obtain proper SVR parameters and thereby improve SVR performance. The enhanced PSO implements parallelized linear time-variant acceleration coefficients (TVAC) and inertia weight (IW), and is therefore called PLTVACIW-PSO. The proposed approach is evaluated through experimental comparisons of the proposed algorithm with eleven other algorithms, applied to 21 datasets varying in scale.

1. Introduction

Data mining used in knowledge discovery (Hussein et al., 2019), including supervised and unsupervised machine learning techniques (Willis and Strunk, 2017), helps in the decision-making process. Supervised machine learning, such as classification (Michael and Constantin, 2002) and its regression counterpart, is used to categorize data efficiently based on a set of rules and perspectives. Data must be preprocessed before it is used in a classification technique (Kumar et al., 2019).

Prediction is considered a major application of supervised machine learning algorithms (Zaho et al., 2016) and depends mainly on well-optimized parameters of the supervised data mining algorithm. Prediction models are applicable, for example, to electricity price estimation (Shrivastava & Khosravi, 2014), which helps in managing the uncertainty and fraud that can occur in electricity prices. Unsupervised machine learning techniques, like clustering (Fouad and Dawood, 2016), aim to divide the dataset into partitions called clusters.

The swarm intelligence principle (Eberhart et al., 2001) was introduced to the computational intelligence domain in (Beni and Wang, 1989). Inspired by work in neuroscience and the behavioral sciences, it is an intelligent paradigm for handling problems, especially in the optimization domain, without requiring a global model. A swarm (Ab-Wahab et al., 2015) is a population of identical agents that accomplish emergent tasks by interacting among themselves and with their environment, without central control. In this context, particle swarm optimization (PSO), originally presented by Kennedy and Eberhart (1995; see also Blum and Merkle, 2008), is a metaheuristic global optimization method belonging to the family of algorithms based on the swarm intelligence concept. PSO emulates the coordinated movement of bird flocks and fish schools, which convey information across the group to support decision making in a synchronized way. While the flock moves in search of food, each particle tracks its own position and velocity.
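
To make these mechanics concrete, the minimal sketch below (Python with NumPy) shows the canonical PSO velocity and position updates driven by an inertia weight w and acceleration coefficients c1 and c2. The function name pso_minimize, the parameter defaults, and the toy sphere objective are illustrative assumptions, not code from the paper.

```python
import numpy as np

def pso_minimize(objective, dim, n_particles=30, iters=100,
                 w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
    """Canonical PSO sketch: each particle tracks its position, velocity,
    personal best, and the swarm's global best (all values illustrative)."""
    lo, hi = bounds
    rng = np.random.default_rng(0)
    pos = rng.uniform(lo, hi, (n_particles, dim))
    vel = np.zeros((n_particles, dim))
    pbest = pos.copy()
    pbest_val = np.array([objective(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()

    for _ in range(iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # velocity update: inertia term + cognitive pull + social pull
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([objective(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

# toy usage: minimize the sphere function in 3 dimensions
best_x, best_f = pso_minimize(lambda x: float(np.sum(x ** 2)), dim=3)
```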

In this paper, enhanced prediction through scalable data mining is achieved by proposing an intelligent approach that integrates the enhanced PSO with SVR to provide an effective prediction process. The enhanced PSO, called PLTVACIW-PSO, is based on parallelized linear time-variant acceleration coefficients (TVAC) and inertia weight (IW). The proposed optimization algorithm, PLTVACIW-PSO, is exploited to optimize and adjust the SVR parameters, yielding PLTVACIW-PSO-SVR. PLTVACIW-PSO-SVR is evaluated through experimental comparisons of the proposed algorithm with eleven other algorithms, listed in Table 1. These comparisons are performed by applying the proposed algorithm and the other algorithms to twenty-one datasets varying in scale. Furthermore, the evaluation considers execution time to show that PLTVACIW-PSO-SVR performs efficiently on the different datasets compared with the other eleven algorithms.

Table 1. Algorithms

No. | Algorithm Name | Short Name | Category | Experiment Phase | Referenced or Implemented | Description
1 | Linear Inertia Weight PSO | LIW-PSO | IW Family | 1 and 2 | Zheng et al., 2003 | Inertia weight w varies linearly over time.
2 | Parallelized Linear Inertia Weight PSO | PLIW-PSO | IW Family | 1 and 2 | Implemented | A parallelized version of LIW-PSO.
3 | Non-Linear Inertia Weight PSO | NLIW-PSO | IW Family | 1 and 2 | Chatterjee et al., 2006 | Inertia weight w varies non-linearly over time.
4 | Parallelized Non-Linear Inertia Weight PSO | PNLIW-PSO | IW Family | 1 and 2 | Implemented | A parallelized version of NLIW-PSO.
5 | Linear Time-Variant PSO | LTV-PSO | TVAC Family | 1 and 2 | Ratnaweera et al., 2004 | Acceleration coefficients c1 and c2 vary linearly over time.
6 | Parallelized Linear Time-Variant PSO | PLTV-PSO | TVAC Family | 1 and 2 | Fouad et al., 2017 | A parallelized version of LTV-PSO.
7 | Non-Linear Time-Variant PSO | NLTV-PSO | TVAC Family | 1 and 2 | Chen et al., 2009 | Acceleration coefficients c1 and c2 vary non-linearly over time.
8 | Parallelized Non-Linear Time-Variant PSO | PNLTV-PSO | TVAC Family | 1 and 2 | Fouad et al., 2017 | A parallelized version of NLTV-PSO.
9 | Particle Swarm Optimization | PSO | Traditional Algorithms | 1 and 2 | Kennedy and Eberhart, 1995 | An algorithm inspired by the behavior of bird flocks and fish schools.
10 | Differential Evolution | DE | Traditional Algorithms | 1 and 2 | Price et al., 2006 | A modified version of genetic algorithms.
11 | Genetic Algorithms | GA | Traditional Algorithms | 1 and 2 | Coley, 1999 | A nature-inspired optimization algorithm based on genes, crossover, and mutation.
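
As a rough illustration of how a PSO with linearly time-varying acceleration coefficients (TVAC) and a linearly decreasing inertia weight (IW) can tune SVR parameters, the sketch below couples such a swarm to scikit-learn's SVR and searches over C, epsilon, and gamma using cross-validated MSE as the fitness. This is a simplified, serial approximation of the idea only; the paper's PLTVACIW-PSO additionally parallelizes the swarm, and the ranges, schedules, and synthetic dataset here are assumptions made for illustration.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.datasets import make_regression
from sklearn.model_selection import cross_val_score

# synthetic regression data standing in for one of the benchmark datasets
X, y = make_regression(n_samples=200, n_features=8, noise=0.2, random_state=1)

# illustrative search ranges for SVR (C, epsilon, gamma), on a log10 scale
LOW  = np.array([-1.0, -3.0, -3.0])
HIGH = np.array([ 3.0,  0.0,  1.0])

def fitness(p):
    """Cross-validated MSE of an SVR configured by the particle's position."""
    C, eps, gamma = 10.0 ** p
    model = SVR(C=C, epsilon=eps, gamma=gamma)
    return -cross_val_score(model, X, y, cv=3,
                            scoring="neg_mean_squared_error").mean()

rng = np.random.default_rng(7)
n, dim, iters = 20, 3, 30
pos = rng.uniform(LOW, HIGH, (n, dim))
vel = np.zeros((n, dim))
pbest, pbest_val = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for t in range(iters):
    frac = t / (iters - 1)
    w  = 0.9 - 0.5 * frac            # linearly decreasing inertia weight (IW)
    c1 = 2.5 - 2.0 * frac            # cognitive coefficient shrinks over time (TVAC)
    c2 = 0.5 + 2.0 * frac            # social coefficient grows over time (TVAC)
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, LOW, HIGH)
    vals = np.array([fitness(p) for p in pos])
    better = vals < pbest_val
    pbest[better], pbest_val[better] = pos[better], vals[better]
    gbest = pbest[pbest_val.argmin()].copy()

print("best (C, epsilon, gamma):", 10.0 ** gbest, "CV MSE:", pbest_val.min())
```

The TVAC schedule above follows the commonly used pattern of starting with a strong cognitive pull (exploration) and ending with a strong social pull (exploitation); the specific endpoints are illustrative defaults, not values reported by the authors.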
