DE/EI - A New Differential Evolution Selection Operator Based on Entropy and Index for Feature Ranking: DE/EI Selection Operator


Rashmi Behal Gandhi (USICT, GGSIPU, Dwarka, New Delhi, India) and Udayan Ghose (USICT, GGSIPU, Delhi, India)
Copyright: © 2020 | Pages: 14
DOI: 10.4018/IJIRR.2020100105


The goal of feature ranking is to find the optimal list of features. Feature ranking methods use different search techniques to select features. Since an optimal feature selection yields an optimal feature ranking list, a stochastic search method is needed to select features. In this article, a new DE selection operator is introduced. To assess the value of the features, its fitness function is calculated using Shannon entropy and singular value decomposition entropy. The index of the selected features is computed with the JI, MCI, and AA indices to measure the stability of the feature list. Hence, the DE/EI parent selection operator is proposed. The six fitness functions (SMCI, SVDMCI, SJI, SVDJI, SAA, and SVDAA) are thoroughly tested on ten UCI data sets, and their performance is measured with classifiers such as Naive Bayes and Support Vector Machine. The experimental results show that the proposed method can be efficiently consolidated into any evolutionary framework that is based on parent selection.
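As a rough sketch of the stability measurement the abstract refers to, assuming JI denotes the standard Jaccard index of the top-k sets of two ranked feature lists (the MCI and AA indices are not shown, and the function name is illustrative, not taken from the paper):

```python
def jaccard_index(ranking_a, ranking_b, k):
    """Jaccard index (JI) of the top-k features of two ranked lists:
    |intersection| / |union| of the two top-k sets, a value in [0, 1]."""
    top_a, top_b = set(ranking_a[:k]), set(ranking_b[:k])
    return len(top_a & top_b) / len(top_a | top_b)

# Two rankings whose top-3 sets are {f1, f2, f3} and {f2, f3, f4}:
# intersection size 2, union size 4, so JI = 0.5
jaccard_index(["f1", "f2", "f3", "f4"], ["f2", "f3", "f4", "f1"], 3)  # -> 0.5
```

A JI near 1 over repeated runs of the stochastic search indicates a stable ranked feature list.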
Article Preview


Feature ranking is the process of choosing informative features. Optimal features are required to improve the scalability, efficiency, and accuracy of the classification process (Han, Pei, & Kamber, 2011). Among the search methods that aim to reduce the computational burden, differential evolution (DE) and its variants have received the most attention. Heuristic search allows the user to obtain the optimal features that describe the feature set optimization procedure. Singular value decomposition (SVD) is a linear map that extracts the informative data from the exhaustive data.

This informative data comprises reduced Eigen features and an Eigenvector space. During reduction, each Eigen feature is represented as diagonalized data and each Eigenvector at an analogous Eigen expression level, to indicate their corresponding importance. Relevant features are found using evaluation measures on the SVD matrix. Singular value decomposition entropy (SVDE) explores the optimal solution and ranks features (Alter, Brown, & Botstein, 2000). SVDE represents the randomness of a data set through its Eigenvalue spectrum, and the entropy value reflects how evenly the Eigenvalues are distributed. The difference between the Shannon entropy (SE) and SVDE calculations lies in what is summed over: estimated probabilities versus the distribution of Eigenvalues.

Features with the maximum fitness play the dominant role, and the ranked features are derived from the entropy fitness function. A good search direction leads the population toward the optimum. The proposed selection fitness operator is designed to handle premature convergence of the DE algorithm. The standard operator favors individuals with high fitness to the detriment of others, so the proposed operator assumes that the subpopulations with the highest fitness have the largest population sizes. The convergence behavior of the operator reduces the selection pressure, which in turn lowers the premature convergence rate. Each target vector generates a trial vector using different index techniques. Since selection is considered more powerful than mutation, this operator can also help improve mutation and crossover while handling the data with care. This is useful in areas such as feature ranking, missing data imputation, function optimization, software engineering, and software test generation.

This paper is organized as follows: Section 2 discusses the related work. Section 3 presents the proposed ranking method with SE and SVDE. Section 4 evaluates the method with different index measures and classifiers. Section 5 concludes the work.
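A minimal sketch of the two entropy calculations contrasted above, assuming the common definitions: Shannon entropy over a histogram estimate of a feature's value distribution, and SVD entropy over the normalized squared singular values (the Eigenvalue distribution) of the data matrix. The function names and the histogram binning are illustrative choices, not taken from the paper.

```python
import numpy as np

def shannon_entropy(feature, bins=10):
    """Shannon entropy (SE) of one feature: probabilities are
    estimated from normalized histogram bin counts."""
    counts, _ = np.histogram(feature, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]                       # drop empty bins to avoid log(0)
    return float(-np.sum(p * np.log2(p)))

def svd_entropy(X):
    """SVD entropy (SVDE) of a data matrix: Shannon entropy of the
    normalized squared singular values (the Eigenvalue spectrum)."""
    s = np.linalg.svd(X, compute_uv=False)
    v = s**2 / np.sum(s**2)
    v = v[v > 0]
    return float(-np.sum(v * np.log2(v)))
```

Both quantities are larger when the underlying distribution is more uniform. In an entropy-based fitness function, one plausible use is to score each candidate feature by the change in SVDE when that feature is removed from the data matrix.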
