Realizing the Need for Intelligent Optimization Tool

Dilip Kumar Pratihar
DOI: 10.4018/978-1-5225-0058-2.ch001


A large number of traditional optimization tools are available in the literature, but each of these techniques is suitable only for a particular class of problems. Realizing this fact, non-traditional optimization tools have been proposed, which are supposed to be robust enough to solve a variety of problems. Moreover, these tools should be able to reach the optimal solutions quickly and as accurately as possible. However, the family of non-traditional optimization tools has itself grown large, which contradicts the very purpose of developing such tools. In this write-up, the reasons behind this fact are discussed in detail, and the need is established for an intelligent optimization tool, which is supposed to be problem-independent.
Traditional Tools For Optimization

There exist various traditional (also known as conventional) tools for optimization, and these are broadly classified into two groups, namely direct search methods and gradient-based search methods. In a direct search method, the search is guided by the value of the objective function, whereas in a gradient-based method, the search direction is decided by the gradient information of the objective function. A direct search method may take a large number of iterations to reach the optimized solution. On the other hand, a gradient-based method can yield the optimized solution in comparatively fewer iterations. However, its solution has a higher chance of getting stuck at a local minimum, as the gradient is a local property of the surface of the objective function. Interested readers may refer to Rao (1978) and Deb (1995) for a detailed description of these traditional tools for optimization. Traditional tools for optimization have the following demerits:

  • A specific problem may require a particular traditional tool in order to be solved efficiently. Therefore, traditional tools may not be robust enough to solve a variety of problems. For example, to solve an optimization problem involving integer variables, a special type of tool called an integer programming algorithm has to be used.

  • Gradient-based optimization tools cannot be used to solve the problems involving discontinuous objective functions.

  • The solutions of a gradient-based method are more likely to get trapped in local minima, for the reason mentioned above.

  • Direct search methods may consume an unnecessarily long time to locate the optimized solution.

  • As traditional optimization tools start with a single, randomly selected initial solution, they cannot exploit parallel computing.
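The local-minimum problem of gradient-based methods mentioned above can be illustrated with a short sketch: plain gradient descent on a simple multimodal function. The function, step size, and starting points below are illustrative assumptions, not examples taken from the chapter.

```python
# Gradient descent on f(x) = x^4 - 3x^2 + x, which has two minima:
# a local one near x = 1.13 and the global one near x = -1.30.
def grad(x):
    return 4 * x**3 - 6 * x + 1  # derivative of f

def gradient_descent(x0, lr=0.01, steps=2000):
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)  # move opposite to the local slope
    return x

# The outcome depends entirely on the starting point: the gradient only
# sees the local shape of the surface, not the global picture.
print(gradient_descent(2.0))   # settles in the local minimum (near 1.13)
print(gradient_descent(-2.0))  # settles in the global minimum (near -1.30)
```

Because the gradient is a purely local property, the search starting at x = 2.0 never discovers the better minimum on the other side of the intervening hill.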


Non-Traditional Tools For Optimization

To overcome the above drawbacks of the traditional tools, a number of non-traditional (also called unconventional) optimization tools have been proposed. By mimicking the mechanisms of biological adaptation and evolution, various tools like Genetic Algorithms (GAs) (Holland, 1975), Genetic Programming (GP) (Koza, 1992), Evolution Strategies (ES) (Rechenberg, 1973), Evolutionary Programming (EP) (Fogel, Owens, & Walsh, 1966), and others, were developed.
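As a concrete illustration of the population-based approach these tools share, the sketch below shows a minimal real-coded genetic algorithm minimizing a sphere function. The specific operators (tournament selection, blend crossover, Gaussian mutation) and all parameter values are illustrative assumptions, not prescriptions from the chapter.

```python
import random

def sphere(x):
    """Objective to minimize: sum of squares (minimum 0 at the origin)."""
    return sum(v * v for v in x)

def genetic_algorithm(dim=3, pop_size=30, generations=200, seed=0):
    rng = random.Random(seed)
    # A whole population of candidate solutions, not a single starting point.
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        def select():
            # Tournament selection: the better of two random individuals wins.
            a, b = rng.sample(pop, 2)
            return a if sphere(a) < sphere(b) else b
        children = []
        for _ in range(pop_size):
            p1, p2 = select(), select()
            # Blend crossover followed by a small Gaussian mutation per gene.
            child = [(g1 + g2) / 2 + rng.gauss(0, 0.1) for g1, g2 in zip(p1, p2)]
            children.append(child)
        # Elitism: carry the best individual into the next generation.
        pop = sorted(children + [min(pop, key=sphere)], key=sphere)[:pop_size]
    return min(pop, key=sphere)

best = genetic_algorithm()
print(sphere(best))  # a small value close to 0
```

Note that the algorithm uses only objective-function values, never gradients, which is why such tools can handle discontinuous objective functions and are naturally suited to parallel evaluation of the population.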

Key Terms in this Chapter

Optimization: The process of finding the best one out of all feasible solutions.

Intelligent Optimization Tool: An optimization tool that is able to adjust its parameters in an adaptive way, so that its performance becomes problem-independent.

Genetic Algorithm: It is a population-based search and optimization tool that works based on Darwin’s principle of natural selection.

Accuracy: It is measured by the deviation of the calculated value from its target value; the smaller the deviation, the higher the accuracy.

Robustness: The property by virtue of which a tool is able to solve a variety of problems.

Particle Swarm Optimization: It is a population-based evolutionary computation technique that works according to the principles of bird flocking and fish schooling.
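A minimal sketch of the standard inertia-weight form of particle swarm optimization, applied to the same sphere function, may make the definition concrete. The coefficient values and bounds are common illustrative choices, not values given in the chapter.

```python
import random

def pso(f, dim=2, swarm=20, iters=300, seed=1):
    rng = random.Random(seed)
    # Each particle carries a position, a velocity, and its personal best.
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=f)[:]  # best position the whole swarm has seen
    w, c1, c2 = 0.7, 1.5, 1.5     # inertia, cognitive, social coefficients
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Velocity blends momentum, attraction to the particle's own
                # best ("cognitive"), and attraction to the swarm's best ("social").
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pbest[i]) < f(gbest):
                    gbest = pbest[i][:]
    return gbest

best = pso(lambda x: sum(v * v for v in x))
print(best)  # near the origin, the minimum of the sphere function
```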

Search Speed: It is the speed at which the algorithm moves towards the optimal solution iteratively.

Simulated Annealing: It is an optimization algorithm developed by artificially modeling the cooling process of molten metal.
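The cooling analogy in the definition above can be sketched as follows: worse moves are accepted with a probability that shrinks as an artificial "temperature" falls, which lets the search escape local minima early on. The schedule, step size, and Metropolis acceptance rule below are standard illustrative choices, not details specified in the chapter.

```python
import math
import random

def simulated_annealing(f, x0, temp=10.0, cooling=0.995, steps=5000, seed=2):
    rng = random.Random(seed)
    x, best = x0, x0
    for _ in range(steps):
        # Propose a random neighbour of the current solution.
        cand = x + rng.gauss(0, 0.5)
        delta = f(cand) - f(x)
        # Accept improvements always; accept worse moves with a
        # temperature-dependent probability (Metropolis criterion).
        if delta < 0 or rng.random() < math.exp(-delta / temp):
            x = cand
            if f(x) < f(best):
                best = x
        temp *= cooling  # gradually "cool" the system
    return best

# The multimodal function used earlier: a local minimum near x = 1.13
# and the global minimum near x = -1.30.
f = lambda x: x**4 - 3 * x**2 + x
best = simulated_annealing(f, x0=2.0)
print(best)
```

Unlike plain gradient descent started at x = 2.0, the high-temperature phase gives the search a chance to cross the hill separating the two basins before the system freezes.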
