Use of Chaotic Randomness Numbers: Metaheuristic and Artificial Intelligence Algorithms

Alper Ozpinar, Emel Seyma Kucukasci
Copyright: © 2016 | Pages: 21
DOI: 10.4018/978-1-5225-0075-9.ch010

Abstract

Balancing the demand and supply of any resource has been one of humanity's main concerns since nearly the beginning of time. The cost of adding an extra resource, felt through higher energy requirements, more emissions, and interactions with policies and market conditions, makes the problem even more complicated. Optimizing demand and supply is the key to solving it successfully. The literature offers many optimization algorithms, and most of them rely on iteration and some degree of randomness to find the optimum solution. Most metaheuristic and artificial intelligence algorithms require randomness whenever they decide how to move forward. This chapter therefore examines the possible use of chaotic random numbers in metaheuristic and artificial intelligence algorithms that require random numbers. The authors provide only the information necessary to follow the algorithms rather than a fully detailed treatment, assuming that readers already have the basic theoretical background.

Introduction

Optimization problems of recent years are not limited to balancing classical supply and demand; they also cover common optimization problems arising from real-world challenges. These challenges include sustainable energy management, logistics, transportation, production, manufacturing, consumption, healthcare, education, finance, telecommunications, cloud computing, smart grids, the Internet of Things, and even genetic research. Many of these problems can be formulated as combinatorial optimization problems (COPs), and most are classified as NP-hard. These real-life COPs are frequently characterized by their large scale and by the need to obtain high-quality solutions in short computing times, so they call for metaheuristic and also artificial intelligence algorithms (Juan, Faulin, Grasman, Rabe, & Figueira, 2015).

In practice, the initialization of stochastic and combinatorial optimization algorithms is based on random number sequences and feeds. Modifications to the initialization directly affect the later phases and the overall success of the applied metaheuristic. Moreover, some algorithms make selections driven by random numbers while searching the solution space during their iterations. The performance of evolutionary algorithms has been reported to improve when qualified random number generators are used (Bastos-Filho, Oliveira, Nascimento, & Ramos, 2010; Caponetto, Fortuna, Fazzino, & Xibilia, 2003).
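As a minimal sketch of this dependence, the C fragment below initializes a one-dimensional candidate population through a generator passed in as a function pointer, so that the standard library source could later be swapped for a higher-quality one. The population size, search range, and function names are illustrative assumptions, not part of any particular algorithm from the literature.

    #include <stdio.h>
    #include <stdlib.h>

    /* Any source of uniform numbers in [0, 1) can be plugged in here. */
    typedef double (*rng_fn)(void);

    static double libc_uniform(void) {
        return rand() / (RAND_MAX + 1.0);
    }

    /* Initialize a population inside [lo, hi]; every later phase of the
       metaheuristic inherits whatever this generator produced. */
    static void init_population(double *pop, int n, double lo, double hi,
                                rng_fn next) {
        for (int i = 0; i < n; i++)
            pop[i] = lo + (hi - lo) * next();
    }

    int main(void) {
        double pop[5];
        init_population(pop, 5, -10.0, 10.0, libc_uniform);
        for (int i = 0; i < 5; i++)
            printf("%f\n", pop[i]);
        return 0;
    }

Keeping the generator behind a single function pointer makes the comparison described above easy to run: repeating the same metaheuristic with a different random source changes only one argument.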

To solve complex problems with large domains, the preferred approach is to solve them in a distributed and parallel fashion on high-performance computing systems and supercomputers. The main problem steps are distributed among the computing machines that form the distributed system. The term computing machine refers to a single computer with a multi-core CPU, a high-performance computing cluster, a supercomputer, or a distributed grid structure formed by many computing nodes. A good distributed computing structure requires a high level of transparency. In general, transparency refers to operation that does not require a common operation set, synchronized time settings between nodes, or shared memory and variables. Many multi-core processors are homogeneous hardware architectures in terms of shared memory and direct memory access, meaning that all cores are identical; however, heterogeneous multi-core configurations also exist in various computing centers (D'Angelo & Marzolla, 2014). The transparency required by high-performance and cloud computing applications corresponds to the ideal case in which the entire computing inventory acts like a single system even though it is not one.

A common problem here is that increasing the number of devices increases the chance of coincidences in random computation. Because of their logic and circuit foundations, computers always behave the same way unless they are affected by some external feedback or source. They can create a random number with an algorithm, usually by implementing a deterministic function. Without a dynamic input or bias, this pairing never surprises its user. For example, most programming courses include a practical lecture with everyday examples of generating random numbers for lottery or card games. In a simple exercise that draws n random numbers with the simplest possible code, all the students in the laboratory will get the same lucky numbers in the same order. In some programming languages, such as C, a seed number can be used to change the random number generation process, but this time the students who use the same seed value again obtain the same numbers.
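A minimal C sketch of this classroom scenario, assuming a lottery-style draw of numbers between 1 and 49 (the range and the seed value are arbitrary choices for illustration):

    #include <stdio.h>
    #include <stdlib.h>

    int main(void) {
        /* No srand() call: rand() starts from the same default seed on
           every run, so every student draws the same sequence. */
        printf("unseeded: ");
        for (int i = 0; i < 6; i++)
            printf("%d ", rand() % 49 + 1);

        /* Seeding changes the sequence, but any two programs that pass
           the same value to srand() still agree number for number. */
        srand(2016);
        printf("\nseeded:   ");
        for (int i = 0; i < 6; i++)
            printf("%d ", rand() % 49 + 1);
        printf("\n");
        return 0;
    }

Running the program twice prints identical output both times, which is exactly the behavior the lecture example exposes.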

Using the current time as a continuously changing, non-repeating seed does not solve the uniqueness problem of random numbers either. This is the sensitive and crucial point when solving problems in parallel and distributed environments: nodes that happen to seed at the same instant draw identical sequences. Considering that the top supercomputers contain more than a million cores, the probability of different cores obtaining the same random numbers increases as the quality of the random numbers decreases. As a result, either computational performance drops or only a limited search domain is explored, neither of which is desirable. In conclusion, a good-quality random number generation process is required. The following sections explain this idea in detail, with definitions and applications.
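As a preview of the chaotic alternative developed in the following sections, the sketch below uses the logistic map x(n+1) = r * x(n) * (1 - x(n)) with r = 4, a standard fully chaotic map on (0, 1), as a per-node number stream; the seed values are illustrative assumptions. Because nearby trajectories diverge exponentially, two nodes whose seeds differ only in the sixth decimal place quickly produce unrelated sequences:

    #include <stdio.h>

    /* Logistic map x(n+1) = 4 x(n) (1 - x(n)): fully chaotic on (0, 1),
       so trajectories started from nearby seeds separate very fast. */
    static double logistic_next(double *x) {
        *x = 4.0 * *x * (1.0 - *x);
        return *x;
    }

    int main(void) {
        /* Two nodes with almost identical seeds... */
        double a = 0.612300, b = 0.612301;
        for (int i = 0; i < 10; i++) {
            double va = logistic_next(&a);
            double vb = logistic_next(&b);
            /* ...drift apart within a handful of iterations. */
            printf("%2d  %.6f  %.6f\n", i, va, vb);
        }
        return 0;
    }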
