
Xin-She Yang (School of Science and Technology, Middlesex University, UK) and Suash Deb (Cambridge Institute of Technology, India)

Copyright: © 2015
|Pages: 10

DOI: 10.4018/978-1-4666-5888-2.ch014

## Introduction

minimize f_i(**x**), (i = 1, 2, ..., M), (1)

subject to the constraints

g_j(**x**) ≤ 0, (j = 1, 2, ..., J), (2)

h_k(**x**) = 0, (k = 1, 2, ..., K), (3)

where f_i, g_j and h_k are in general nonlinear functions. Here the design vector **x** = (x_1, x_2, ..., x_d) can be continuous, discrete or mixed in a *d*-dimensional space. The functions f_i are called objectives or cost functions, and when M > 1, the optimization problem is multiobjective or multicriteria (Yang, 2008, 2010a; Yang, 2010b). It is possible to combine different objectives into a single objective, though multiobjective optimization can give far more information and options to the decision-makers, with more insight into the problem. It is worth pointing out that we write the problem here as a minimization problem; it can also be written as a maximization problem by simply replacing f_i(**x**) with -f_i(**x**).
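The standard formulation (1)-(3) can be sketched in code for a single objective (M = 1). The particular objective and constraint functions below are illustrative assumptions, not taken from the chapter:

```python
import numpy as np

def f(x):                       # objective f(x) to be minimized
    return (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2

def g(x):                       # inequality constraint g(x) <= 0
    return x[0] + x[1] - 4.0

def h(x):                       # equality constraint h(x) = 0
    return x[0] - x[1]

def is_feasible(x, tol=1e-9):
    # a point is feasible if it satisfies (2) and (3) within tolerance
    return g(x) <= tol and abs(h(x)) <= tol

x = np.array([1.5, 1.5])
print(f(x), is_feasible(x))     # -> 0.5 True
```

A maximization problem would be handled the same way by minimizing -f(x), as noted above.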

In almost all applications in engineering and industry, we are trying to optimize something, whether to minimize cost and energy consumption, or to maximize profit, output, performance and efficiency (Yang, 2010b; Yang & Koziel, 2011). The optimal use of available resources of any sort requires a paradigm shift in scientific thinking, because most real-world applications have far more complicated factors and parameters that affect how the system behaves.

Optimization algorithms are the tools and techniques for achieving optimality in the problem of interest. This search for optimality is further complicated by the fact that uncertainty is almost always present in real-world systems. Therefore, we seek not only the optimal design but also a robust design in engineering and industry. Optimal design solutions that are not robust enough are not practical in reality. In such cases, suboptimal but robust solutions are often preferred, because they are less sensitive to uncertainties, such as variations in material properties, in real systems.

Optimization problems can be formulated in many ways. For example, the commonly used method of least-squares is a special case of maximum-likelihood formulations. By far the most widely used formulation is to write a nonlinear optimization problem in the standard form (1)-(3).

When all the functions are nonlinear, we are dealing with nonlinear constrained problems. In the special case when all the functions are linear, the problem becomes linear, and we can use widely used linear programming techniques such as the simplex method. When some design variables can only take discrete values (often integers) while other variables are real and continuous, the problem is of mixed type, which is often difficult to solve, especially for large-scale optimization problems. A very special class of optimization is convex optimization, which has guaranteed global optimality: any local optimum is also the global optimum, and, most importantly, there are efficient polynomial-time algorithms to solve such problems.
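As a concrete illustration of the linear case, a small linear program can be solved with SciPy's `linprog` routine. The problem data here is a made-up example, and this assumes SciPy is available; the chapter itself does not prescribe any particular solver:

```python
from scipy.optimize import linprog

# maximize 3*x + 2*y  ->  minimize -3*x - 2*y (maximization via negation)
c = [-3.0, -2.0]
A_ub = [[1.0, 1.0],   # x + y <= 4
        [1.0, 0.0]]   # x     <= 2
b_ub = [4.0, 2.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None), (0, None)], method="highs")
print(res.x, -res.fun)   # optimal point and the maximized objective
```

The optimum here is x = 2, y = 2 with objective value 10, found in polynomial time, in contrast to general nonlinear or mixed-integer problems.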

Optimization: A problem or solution procedure which aims to find the optimal solutions to the objective function or functions under constraints.

Lévy Flight: A random process or random walk whose step-size distribution obeys the power-law Lévy distribution. Some species of birds and fruit flies appear to follow this behavior in their flight paths.
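Lévy-distributed step sizes are commonly generated with Mantegna's algorithm, a sketch of which is shown below; the parameter `beta` is the stability index, and the value 1.5 is a conventional choice rather than one fixed by this chapter:

```python
import math
import numpy as np

def levy_steps(n, beta=1.5, rng=None):
    """Draw n heavy-tailed step sizes via Mantegna's algorithm."""
    rng = rng or np.random.default_rng(0)
    # scale for the numerator Gaussian so that u/|v|^(1/beta) is Levy-stable
    sigma_u = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
               / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))
               ) ** (1 / beta)
    u = rng.normal(0.0, sigma_u, n)
    v = rng.normal(0.0, 1.0, n)
    return u / np.abs(v) ** (1 / beta)

steps = levy_steps(1000)   # occasional very large steps among many small ones
```

The heavy tail is what distinguishes a Lévy flight from a Gaussian random walk: most steps are small, but occasional long jumps allow far-ranging exploration.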

NP-Hard: A problem is called hard if its solution time grows exponentially with problem size. NP-hard means non-deterministic polynomial-time hard; no polynomial-time algorithms are known for finding exact solutions to such problems.

Algorithm Complexity: Also known as the time complexity of an algorithm. It is the number of steps needed to complete the execution of an algorithm, and it is a measure of how efficient an algorithm is.

Metaheuristic: A class of stochastic algorithms that combine deterministic and stochastic components. They are often developed by drawing inspiration from Nature.

Cuckoo Search: An optimization strategy, developed by Xin-She Yang and Suash Deb in 2009, based on the breeding behaviour of some cuckoo species.
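A much-simplified sketch of cuckoo search on a test function is given below. The parameter names (`pa` for the abandonment fraction, `n_nests` for the population size) follow common usage, and for brevity the Lévy steps are replaced here by Gaussian steps; this is an assumption to keep the sketch short, not the authors' exact algorithm:

```python
import numpy as np

def sphere(x):
    # a standard benchmark objective: global minimum 0 at the origin
    return float(np.sum(x ** 2))

def cuckoo_search(f, dim=5, n_nests=15, pa=0.25, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    nests = rng.uniform(-5, 5, (n_nests, dim))
    fitness = np.array([f(x) for x in nests])
    for _ in range(iters):
        for i in range(n_nests):
            # generate a new solution by a random step
            # (stand-in for a Levy flight in this sketch)
            cand = nests[i] + 0.1 * rng.standard_normal(dim)
            fc = f(cand)
            j = rng.integers(n_nests)      # compare with a random nest
            if fc < fitness[j]:
                nests[j], fitness[j] = cand, fc
        # abandon a fraction pa of the worst nests and build new ones
        n_drop = int(pa * n_nests)
        worst = np.argsort(fitness)[-n_drop:]
        nests[worst] = rng.uniform(-5, 5, (n_drop, dim))
        fitness[worst] = [f(x) for x in nests[worst]]
    best = int(np.argmin(fitness))
    return nests[best], fitness[best]

best_x, best_f = cuckoo_search(sphere)
```

The abandonment step mirrors a host bird discovering an alien egg with probability pa and building a new nest elsewhere, which maintains diversity in the population.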

Evolutionary Algorithms: A subset of optimization algorithms in evolutionary computation that mainly use genetic operators such as crossover, mutation and selection. Genetic algorithms and evolutionary programming are good examples.

Algorithm: A step-by-step procedure of computational instructions.

