When looking for a solution, deterministic methods have the enormous advantage that they are guaranteed to find the global optimum. Unfortunately, they are very CPU-intensive, and are useless on intractable NP-hard problems whose exhaustive exploration would require thousands of years on cutting-edge computers. To obtain a result at all, one must resort to stochastic algorithms, which sample the search space without exploring it thoroughly. Such algorithms can find very good solutions, without any guarantee that the global optimum has been reached; but there is often no other choice than using them. This chapter is a short introduction to the main methods used in stochastic optimization.
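The idea of sampling the search space without exploring it thoroughly can be sketched with the simplest stochastic method, pure random search. The code below is an illustrative sketch, not a method from this chapter; the fitness function is a hypothetical multimodal example chosen so that its global maximum (fitness 1.0 at x = 0) is surrounded by lower local optima.

```python
import math
import random

def random_search(fitness, bounds, iterations=10_000, seed=0):
    """Sample candidate solutions uniformly at random and keep the best.

    There is no guarantee the global optimum is ever reached, but each
    extra sample can only improve (or keep) the best result so far.
    """
    rng = random.Random(seed)
    lo, hi = bounds
    best_x = rng.uniform(lo, hi)
    best_f = fitness(best_x)
    for _ in range(iterations):
        x = rng.uniform(lo, hi)
        f = fitness(x)
        if f > best_f:
            best_x, best_f = x, f
    return best_x, best_f

# Hypothetical multimodal fitness: global maximum 1.0 at x = 0,
# with many lower local maxima on either side.
fitness = lambda x: math.cos(3 * x) - 0.1 * x * x

best_x, best_f = random_search(fitness, bounds=(-10.0, 10.0))
# With this many samples, best_x should land very close to 0.
```

Doubling `iterations` roughly doubles the cost but only modestly tightens the result, which is exactly the trade-off stochastic optimization accepts: good solutions at a fraction of the cost of exhaustive search.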
Key Terms in this Chapter
Local Search: Local search methods are usually deterministic. They find a local optimum very quickly, but then remain stuck there.
Emergent Behavior: A behavior that was not directly hardcoded into an algorithm.
Heuristic: A non-exact method used to find a good solution, usually based on trial and error.
Unimodal: A problem with a single local optimum, which is therefore also the global optimum. Local search algorithms will find the global optimum of a unimodal problem.
Fitness Landscape: If one plots the fitness of all possible solutions to a problem, one obtains a “landscape” that an optimization algorithm will explore, usually to find a global optimum.
Stigmergy: Basic inter-individual communication based on information left in the environment for others to use.
Multimodal: A problem with several local optima. Local search algorithms get trapped in the first local optimum they find; if one is unlucky, that local optimum is not the global one. Local search algorithms on their own are therefore unreliable on multimodal problems.
Global Search: Global search methods will implement ways to avoid getting trapped in local optima.
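Several of the terms above can be illustrated in a few lines of code. The sketch below, using an assumed multimodal fitness landscape (global maximum 1.0 at x = 0, flanked by lower peaks), shows a greedy hill climber getting trapped in a local optimum, and random restarts as one minimal global-search strategy for escaping such traps; it stands in for, but is not identical to, the methods surveyed in this chapter.

```python
import math
import random

def hill_climb(fitness, x0, step=0.01, max_iters=100_000):
    """Greedy local search: move to a neighbour only while fitness improves.
    Deterministic given the start point; stops at the first local optimum."""
    x, f = x0, fitness(x0)
    for _ in range(max_iters):
        nx = max([x - step, x + step], key=fitness)
        nf = fitness(nx)
        if nf <= f:
            return x, f          # trapped in a local optimum
        x, f = nx, nf
    return x, f

def restart_search(fitness, bounds, restarts=100, seed=1):
    """A simple global strategy: restart local search from random points
    and keep the best local optimum found across all restarts."""
    rng = random.Random(seed)
    lo, hi = bounds
    runs = [hill_climb(fitness, rng.uniform(lo, hi)) for _ in range(restarts)]
    return max(runs, key=lambda r: r[1])

# Hypothetical multimodal landscape: global maximum 1.0 at x = 0,
# nearest local maxima near x = ±2.05 with fitness about 0.57.
fitness = lambda x: math.cos(3 * x) - 0.1 * x * x

# Started near x = 2, the hill climber stops on the nearby lower peak.
x_local, f_local = hill_climb(fitness, 2.0)

# Random restarts escape that trap and should recover the global peak.
x_best, f_best = restart_search(fitness, (-10.0, 10.0))
```

On a unimodal landscape a single `hill_climb` call would suffice; the restarts only pay for themselves when the landscape is multimodal, which is why the distinction between local and global search matters.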