Frequency Modulation Sound Parameter Identification using Shuffled Particle Swarm Optimization

Morteza Alinia Ahandan, Hosein Alavi-Rad, Nooreddin Jafari
Copyright: © 2013 | Pages: 10
DOI: 10.4018/ijaec.2013100104

Abstract

The frequency modulation sound parameter identification is a complex multimodal optimization problem. The problem is modeled as a cost function: the sum-squared error between the samples of the estimated wave and the samples of the real wave. In this research, the authors propose a shuffled particle swarm optimization algorithm to solve this problem. In the shuffled particle swarm optimization proposed here, the population, as in the shuffled frog leaping algorithm, is divided into several memeplexes, and each memeplex is improved by the particle swarm optimization algorithm. A comparison of the results obtained by the authors' proposed algorithm with those reported in the literature confirms the better performance of the proposed algorithm.
Article Preview

Introduction

The frequency modulation sound parameter identification (FMSPI) is a complex multimodal problem with strong epistasis. To solve this problem, it is first modeled as an optimization problem and then an optimization method is employed to extract its parameters. The methods used to solve optimization problems can be divided into two main groups: analytical and non-analytical methods. Analytical optimization methods, such as linear and non-linear programming, impose limitations and initial assumptions on the objective function, e.g. that it be differentiable and convex, so they cannot be applied to a wide variety of optimization problems. Non-analytical optimization methods do not have these limitations; they require little or no additional assumptions about the structure of the optimization problem. Among the most widely applicable of these methods are evolutionary algorithms (EAs).

EAs can be applied to search complex, discrete or continuous, linear or non-linear, and convex or non-convex spaces, so they are recognized as general-purpose, direct-search optimization methods. The existence of a criterion to evaluate a candidate solution is the only requirement for applying an EA to a problem. Particle swarm optimization (PSO) (Kennedy & Eberhart, 1995) is an EA that has attracted many researchers in different engineering fields (Yu & Xiong, 2004; He & Wang, 2007; Kwok et al., 2006; Feng et al., 2007).

PSO simulates the movement of organisms and the social behaviour of a flock of birds or a school of fish. The algorithm updates each member using the particle's previous best position and the previous best position of the particle's neighbors or of the whole swarm. Various concepts have been proposed in the literature to improve the performance of the original PSO.
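In their standard form, the update rules referred to above are $v_i \leftarrow w\,v_i + c_1 r_1 (p_i - x_i) + c_2 r_2 (g - x_i)$ and $x_i \leftarrow x_i + v_i$, where $p_i$ is the particle's own best position and $g$ is the best position of its neighborhood or of the whole swarm. The following is a minimal NumPy sketch; the inertia weight $w$ and acceleration coefficients $c_1$, $c_2$ are illustrative values, not the settings used in this paper.

```python
import numpy as np

def pso_step(x, v, pbest, gbest, w=0.7, c1=1.5, c2=1.5, rng=np.random):
    """One velocity/position update for a whole swarm.

    x, v, pbest : arrays of shape (n_particles, n_dims)
    gbest       : best position found so far by the swarm, shape (n_dims,)
    """
    r1 = rng.random_sample(x.shape)   # random cognitive weights
    r2 = rng.random_sample(x.shape)   # random social weights
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    return x + v, v
```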

Feng et al. (2007) developed an evolutionary fuzzy PSO learning algorithm to self-extract a near-optimum vector quantization codebook for image compression. Their fuzzy PSO vector quantization learning scheme combined the advantages of the adaptive fuzzy inference method, the simple VQ concept, and the efficient PSO to automatically create a near-optimum codebook for image compression. Fan and Zahara (2007) proposed a hybrid NM-PSO algorithm based on the Nelder–Mead (NM) simplex search method and PSO to produce faster and more accurate convergence on unconstrained optimization problems. Lee and Chen (2007) presented an iteration PSO algorithm for determining the optimal contract capacities of a time-of-use (TOU) rates industrial customer. A new index, called iteration best, was incorporated into PSO to improve solution quality and computational efficiency.

Arumugam and Rao (2007) embedded the popular genetic algorithm (GA) cross-over operator and root-mean-square variants into the PSO algorithm to speed up convergence. Their experimental results illustrated the advantage of PSO with the cross-over operator, which sharpened convergence and tuned the search toward the best solution.

Besides the aforementioned concepts, the authors propose a novel approach to improve the performance of PSO. This research employs partitioning and shuffling concepts to give PSO a parallel search ability. The partitioning and shuffling concepts are borrowed from the shuffled frog leaping (SFL) algorithm (Eusuff & Lansey, 2003).
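To make the partition-and-shuffle idea concrete, the following is a minimal sketch of a shuffled PSO, not the authors' exact procedure (their memeplex construction, parameter settings, and stopping rules are described in the fourth section): the swarm is sorted by fitness, dealt into memeplexes in round-robin fashion as in SFL, each memeplex is evolved independently by PSO for a few iterations, and then all particles are pooled and re-partitioned.

```python
import numpy as np

def shuffled_pso(cost, bounds, n_particles=60, n_memeplexes=4,
                 inner_iters=10, outer_iters=50, seed=0):
    """Illustrative shuffled PSO: sort the swarm, deal it into memeplexes,
    improve each memeplex with PSO updates, then pool and re-shuffle."""
    rng = np.random.RandomState(seed)
    low, high = np.asarray(bounds[0], float), np.asarray(bounds[1], float)
    dim = low.size
    x = rng.uniform(low, high, size=(n_particles, dim))
    v = np.zeros_like(x)
    pbest = x.copy()
    pcost = np.array([cost(p) for p in x])

    for _ in range(outer_iters):
        order = np.argsort(pcost)                          # rank particles by fitness
        memeplexes = [order[m::n_memeplexes] for m in range(n_memeplexes)]
        for idx in memeplexes:                             # evolve each memeplex by PSO
            gbest = pbest[idx[np.argmin(pcost[idx])]]
            for _ in range(inner_iters):
                r1, r2 = rng.random_sample((2, idx.size, dim))
                v[idx] = (0.7 * v[idx]
                          + 1.5 * r1 * (pbest[idx] - x[idx])
                          + 1.5 * r2 * (gbest - x[idx]))
                x[idx] = np.clip(x[idx] + v[idx], low, high)
                c = np.array([cost(p) for p in x[idx]])
                improved = c < pcost[idx]
                pbest[idx[improved]] = x[idx][improved]
                pcost[idx[improved]] = c[improved]
                gbest = pbest[idx[np.argmin(pcost[idx])]]
        # Pooling is implicit: the next argsort re-shuffles the whole swarm.
    best = np.argmin(pcost)
    return pbest[best], pcost[best]
```

Re-sorting the pooled swarm at the start of each outer iteration is what implements the shuffling step: information gained inside one memeplex is redistributed across all memeplexes.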

The paper is organized as follows: the second section formulates the FMSPI problem. In the third section, PSO is briefly described. The strategy used to improve PSO is explained in the fourth section. Computational results are presented and discussed in the fifth section, and conclusions are given in the sixth section.

Frequency Modulation Sound Parameter Identification

A global optimization problem can be considered as

\[
\min_{\mathbf{x} \in S} f(\mathbf{x}), \qquad S \subseteq \mathbb{R}^{n} \tag{1}
\]

where $\mathbf{x} = (x_{1}, x_{2}, \ldots, x_{n})$ is the set of variables and $f$ is the objective function. A vector $\mathbf{x}^{*} \in S$ that satisfies $f(\mathbf{x}^{*}) \le f(\mathbf{x})$ for all $\mathbf{x} \in S$ is a global minimizer of $f$ over $S$, and the corresponding value $f(\mathbf{x}^{*})$ is called the global minimum.
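For concreteness, the FMSPI cost function used throughout the related literature (and presumably the one intended in the later equations of this paper, which this preview truncates) estimates the six parameters $(a_1, \omega_1, a_2, \omega_2, a_3, \omega_3)$ of a nested FM wave $y(t) = a_1 \sin(\omega_1 t \theta + a_2 \sin(\omega_2 t \theta + a_3 \sin(\omega_3 t \theta)))$ with $\theta = 2\pi/100$, by minimizing the sum-squared error against a target wave over $t = 0, 1, \ldots, 100$. Below is a minimal sketch, assuming the commonly used target parameters $(1.0, 5.0, 1.5, 4.8, 2.0, 4.9)$:

```python
import numpy as np

THETA = 2.0 * np.pi / 100.0                          # sampling constant in the usual benchmark
TARGET = np.array([1.0, 5.0, 1.5, 4.8, 2.0, 4.9])    # assumed reference parameters (a1, w1, a2, w2, a3, w3)

def fm_wave(params, t):
    """Nested frequency-modulated wave y(t) for parameters (a1, w1, a2, w2, a3, w3)."""
    a1, w1, a2, w2, a3, w3 = params
    return a1 * np.sin(w1 * t * THETA
                       + a2 * np.sin(w2 * t * THETA
                                     + a3 * np.sin(w3 * t * THETA)))

def fmspi_cost(params, t=np.arange(101)):
    """Sum-squared error between the estimated wave and the target (real) wave."""
    return np.sum((fm_wave(params, t) - fm_wave(TARGET, t)) ** 2)
```

In the benchmark as usually stated, each parameter is searched over $[-6.4, 6.35]$; this range and the target values above are assumptions carried over from the standard formulation, not values confirmed by this preview.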
