Multi-Objective Binary Fish School Search


Mariana Gomes da Motta Macedo (University of Pernambuco, Brazil), Carmelo J. A. Bastos-Filho (University of Pernambuco, Brazil), Susana M. Vieira (University of Lisboa, Portugal), and João M. C. Sousa (University of Lisboa, Portugal)
Copyright: © 2018 | Pages: 20
DOI: 10.4018/978-1-5225-5134-8.ch003


The fish school search (FSS) algorithm has inspired several adaptations for multi-objective problems and for binary optimization. However, no previous proposition solves both problems simultaneously. The proposed multi-objective binary fish school search (MOBFSS) aims to solve optimization problems with two or three conflicting objective functions and binary decision variables. MOBFSS is based on the dominance concept used in the multi-objective fish school search (MOFSS) and the threshold technique deployed in the binary fish school search (BFSS). Additionally, the authors evaluate the proposal on feature selection for classification in well-known datasets and compare its performance with a state-of-the-art algorithm called BMOPSO-CDR. MOBFSS presents better results than BMOPSO-CDR, especially for datasets with higher complexity.
Chapter Preview


Binary optimization is a class of problems in which the input variables to be determined are binary, meaning that each variable can assume only two states. This type of approach applies naturally to the feature selection problem, in which one must define which features should be chosen for a particular task. Feature selection can be used to identify which features are relevant for a specific task, such as classification, regression, or forecasting.
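In this encoding, each candidate solution is a 0/1 vector with one bit per feature, where a 1 means the feature is kept. The chapter does not give a concrete representation here; the following is a minimal illustrative sketch using a hypothetical data matrix:

```python
import numpy as np

# Hypothetical data: 3 instances, 4 features each.
X = np.array([[5.1, 3.5, 1.4, 0.2],
              [4.9, 3.0, 1.4, 0.2],
              [6.2, 3.4, 5.4, 2.3]])

# A candidate solution in binary feature selection: one bit per feature.
# Here bits 0 and 2 are set, so features 0 and 2 are selected.
mask = np.array([1, 0, 1, 0], dtype=bool)

# Applying the mask keeps only the chosen feature columns.
X_selected = X[:, mask]
print(X_selected.shape)  # (3, 2)
```

Any binary optimizer can then search over such masks, with each bit pattern defining a different feature subset.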

Consider a classification problem as an example. In general, one has several possible features as inputs to determine the best class for each instance, and classification is useful in many knowledge domains, such as medical diagnosis (Shenfield & Rostami, 2015), text classification (Forman, 2003), and fingerprint segmentation (Sankaran et al., 2017). One important aspect of the classification task is the number of features needed to guarantee an accurate result, which can be high for real problems. As a consequence of using many input features, the classifier may require a long processing time. Another issue is that a large number of features can obscure important relationships between instances because of irrelevant or redundant features. Thus, feature selection is a relevant problem for an efficient classification procedure (Guyon & Elisseeff, 2003). Many approaches to feature selection using swarm-based and evolutionary computation techniques have been proposed recently (Xue, 2013a; Xue, 2013b; Xue, 2014; Xue, 2015; Xue, 2016). Many classification methods can be found in the literature. The Support Vector Machine (SVM) (Hearst et al., 1998) is a widely used and well-known technique for classification tasks; although it is not the fastest, it is considered robust. Since the goal of this chapter is not to propose a classification method but to introduce a technique for feature selection, the authors use SVM in all the experiments to validate the proposal.
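The wrapper-style evaluation described above can be sketched as follows. This is not the chapter's exact experimental setup (its SVM parameters and datasets are not reproduced here); it is a hypothetical sketch using scikit-learn's `SVC` with default settings, scoring a binary feature mask by cross-validated accuracy:

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def evaluate_mask(mask, X, y):
    """Score a binary feature mask by the cross-validated accuracy
    of an SVM trained only on the selected feature columns."""
    if not mask.any():
        # An empty feature subset cannot support classification.
        return 0.0
    scores = cross_val_score(SVC(), X[:, mask], y, cv=5)
    return scores.mean()

X, y = load_iris(return_X_y=True)
mask = np.array([1, 0, 1, 1], dtype=bool)  # example candidate subset
print(round(evaluate_mask(mask, X, y), 3))
```

Each candidate feature subset thus receives an accuracy score, which the optimizer uses as one of its objective values.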

On the other hand, from the optimization perspective, the authors highlight that most real-world engineering problems involve more than one optimization target. This means that one may need to simultaneously optimize several fitness functions that conflict with one another. As an example, suppose one must design a classifier that uses a low number of features and presents a small classification error. If the number of required features is reduced below an unknown threshold, the accuracy of the classifier may worsen. This kind of conflict constitutes a typical multi-objective optimization problem. In this chapter, the authors use this optimization problem to validate the proposal and to compare it with other state-of-the-art algorithms.
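The dominance concept that underlies this multi-objective formulation can be stated compactly: for minimization objectives, solution a dominates solution b if a is no worse in every objective and strictly better in at least one. A minimal sketch for the two objectives discussed above (number of selected features, classification error); the solution values are made up for illustration:

```python
def dominates(a, b):
    """Pareto dominance for minimization: a dominates b if a is
    no worse in all objectives and strictly better in at least one."""
    no_worse = all(x <= y for x, y in zip(a, b))
    strictly_better = any(x < y for x, y in zip(a, b))
    return no_worse and strictly_better

# (n_features, error) pairs for three hypothetical feature subsets:
s1 = (3, 0.05)
s2 = (5, 0.05)   # same error, more features: dominated by s1
s3 = (2, 0.10)   # fewer features but higher error: a trade-off with s1

print(dominates(s1, s2))  # True
print(dominates(s1, s3))  # False
print(dominates(s3, s1))  # False
```

Solutions like s1 and s3, neither of which dominates the other, are exactly the trade-offs that form the Pareto front a multi-objective optimizer seeks.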

