Nature Inspired Parallel Computing


Dingju Zhu
DOI: 10.4018/978-1-60566-310-4.ch023

Abstract

Parallel computing is increasingly important for science and engineering, yet it is not used as widely as serial computing. People are accustomed to serial computing and find parallel computing difficult to understand, design, and use. In fact, nothing is more familiar to them than nature, in which all things exist and proceed in parallel. A reader who learned parallel computing before serial computing would, even without reading this chapter, find serial computing the harder of the two to understand, design, and use, because it does not run the way the nature we know does. Nature is composed of a large number of objects and events: events are the spirit of objects, and objects are the body of events; the two are interrelated in nature. Objects can be constructed or exist in parallel, and events can occur or proceed in parallel. This parallelism exists mainly in four dimensions: the space dimension, the application dimension, the time dimension, and the user dimension. After reading this chapter, even a reader long accustomed to serial computing can see that the parallel computing used in his or her applications comes directly from nature. This chapter presents NIPC (Nature Inspired Parallel Computing) and its applications to help you grasp the methods of applying NIPC to your own applications. The authors hope to help you understand and use parallel computing more easily and design and develop parallel software more effectively.

Introduction

We illustrate nature inspired parallel computing in this chapter in eight sections:

  • 1. Parallel Phenomenon in Nature
  • 2. Nature Inspired Parallel Computing
  • 3. Problems with Parallelisms
  • 4. Decomposition and Communication Schemes of NIPC
  • 5. Parallel Schemes of NIPC
  • 6. Distribution Schemes of NIPC
  • 7. Mapping Schemes of NIPC
  • 8. Architecture of NIPC

The first section presents some parallel phenomena in nature to help readers understand parallel computing. The second section discusses what nature inspired parallel computing is. The third section lists three typical problems that can be solved faster and at larger scales via parallel computing. The fourth through seventh sections illustrate the four schemes of NIPC (decomposition and communication schemes, parallel schemes, distribution schemes, and mapping schemes) and apply them to the three problems. The last section presents the architecture of NIPC, which integrates the four schemes and gives readers an overall blueprint of NIPC.


Background

Background of Parallel Computing

The term parallel computing refers to the use of (parallel) supercomputers and computer clusters (Wikipedia, 2008; J. Dongarra, T. Sterling, H. Simon, & E. Strohmaier, 2005). Parallel computing is a form of computing in which many instructions are carried out simultaneously (G.S. Almasi & A. Gottlieb, 1989). It operates on the principle that large problems can almost always be divided into smaller ones, which may be solved concurrently (“in parallel”). It has been used for many years, but interest in it has grown in recent years because physical constraints now prevent further frequency scaling (Wikipedia, 2008). Parallel computing exists in several different forms: bit-level parallelism, instruction-level parallelism, data parallelism, and task parallelism. Parallelism is a primary method for increasing the total power of a supercomputer, and computational physics applications have been the primary drivers of its development over the last 20 years. Languages (e.g., Fortran and C) and libraries (e.g., the message passing interface (MPI) (M. Snir, S. Otto, S. Huss-Lederman, D. Walker, & J. Dongarra, 1996) and linear algebra libraries such as LAPACK (E. Anderson, Z. Bai, C. Bischof, S. Blackford, J. Demmel, J. Dongarra, J. Du Croz, A. Greenbaum, S. Hammarling, A. McKenney, & D. Sorensen, 1999)) allow the programmer to access or expose parallelism in a variety of standard ways (J. Dongarra, 2006).
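The principle that a large problem can be divided into smaller sub-problems solved concurrently can be sketched in a few lines. The sketch below is purely illustrative (a real HPC code would typically use MPI from Fortran or C, as noted above); the function names and the thread pool standing in for parallel hardware are our own assumptions, not part of the chapter.

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    """Solve one small sub-problem: sum a slice of the data."""
    return sum(chunk)

def parallel_sum(data, workers=4):
    """Divide a large problem into smaller ones and solve them concurrently."""
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # Each chunk is handled by a worker; the partial results are combined.
        return sum(pool.map(partial_sum, chunks))

print(parallel_sum(list(range(100))))  # same result as the serial sum
```

The decomposition step (splitting the data into chunks) and the combination step (summing the partial results) are exactly the pattern that the decomposition and communication schemes of NIPC generalize.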

Key Terms in this Chapter

Digital City: The concept of a “digital city” derives from the concept of the “digital earth,” first put forward by Al Gore. A digital city simulates a real city by means of computer science and geographic science to help governments with city planning and transportation simulation. Famous digital cities include Virtual Los Angeles, Model City Philadelphia, Google Earth, and Virtual Earth.

In the time dimension: processes of objects and events at different times in the time dimension are distributed onto different processes and run in parallel.

In the application dimension: processes of objects and events in different applications in the application dimension are distributed onto different processes and run in parallel.

Application Dimension of NIPC: In the application dimension, objects in different applications can be constructed or exist in parallel, and all events in different applications can occur or proceed in parallel. For example, underground objects and events can be processed in parallel by a mine monitoring application, an underwater monitoring application, and other underground applications; objects and events on the earth's surface can be processed in parallel by a city planning application, an ecological resource monitoring application, a tour application, and other surface applications; and objects and events in the sky can be processed in parallel by a climate monitoring application, an air pollution monitoring application, and other sky applications.

Time Dimension of NIPC: In the time dimension, objects can be constructed or exist in parallel at different times, and all events can occur or proceed in parallel at different times. For example, a boy sat on a chair previously, and his father sits on the same chair at present.

In the space dimension: processes of objects and events in different spaces in the space dimension are distributed onto different processes and run in parallel.

Space Dimension of NIPC: In the space dimension, objects at different locations can be constructed or exist in parallel, and all events at different locations can occur or proceed in parallel. For an example in the horizontal direction, one house is being repaired in Nanshan, Shenzhen, China, while at the same time another house is being repaired in Futian, Shenzhen, China. In another example, people are walking on the roads while, at the same time, buses are running on them. For an example in the vertical direction, planes are flying in the sky, clouds are drifting in the sky, rain is falling onto the earth's surface, water is flowing on the surface, and groundwater is flowing underground, all at the same time.
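The vertical example above (sky, surface, underground) is a natural domain decomposition: objects in different spatial layers are independent within one step, so each layer can be processed concurrently. A minimal sketch, assuming hypothetical object names and a thread pool standing in for the parallel processes of the space dimension:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical objects grouped by spatial layer (names are illustrative).
layers = {
    "sky": ["plane", "cloud", "rain"],
    "surface": ["bus", "pedestrian", "river"],
    "underground": ["groundwater"],
}

def step_layer(item):
    """Advance every object in one spatial subdomain by one simulation step."""
    name, objects = item
    return name, [obj + ":stepped" for obj in objects]

# Each spatial layer is an independent subdomain, so all layers
# can be stepped concurrently, mirroring the space dimension.
with ThreadPoolExecutor() as pool:
    state = dict(pool.map(step_layer, layers.items()))

print(state["underground"])  # ['groundwater:stepped']
```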

Nature Inspired Parallel Computing (NIPC): The use of nature's parallel methods to solve computing problems. All things in nature can be simulated by computer, and the parallel phenomena in nature can be simulated with parallel computing technology on an HPC (high performance computer), which consists of many computing nodes with multiple CPUs and cores, unlike a traditional computer with a single computing node. Things in nature can be simulated in parallel on different computing nodes, different CPUs, or different CPU cores of an HPC. Nature is composed of a large number of objects and events: events are the spirit of objects, and objects are the body of events; the two are interrelated in nature. Objects can be constructed or exist in parallel, and events can occur or proceed in parallel. This parallelism exists mainly in four dimensions: the space dimension, the time dimension, the application dimension, and the user dimension. We can design and use NIPC according to these four dimensions.
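The mapping of natural things onto the computing nodes, CPUs, or cores of an HPC can be sketched as follows. This is a minimal illustration only: the object names are hypothetical, and a small thread pool stands in for an HPC's computing nodes.

```python
from concurrent.futures import ThreadPoolExecutor

def simulate(obj):
    """Simulate one natural object; in NIPC each such simulation
    runs on whichever computing node, CPU, or core it is mapped to."""
    return obj + "-simulated"

natural_objects = ["tree", "river", "cloud", "bus"]

# The pool stands in for an HPC's computing nodes: each object is
# simulated in parallel, just as the corresponding thing exists in
# parallel in nature.
with ThreadPoolExecutor(max_workers=2) as pool:
    results = list(pool.map(simulate, natural_objects))

print(results)
```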

User Dimension of NIPC: In the user dimension, objects and events can be used or viewed by users in parallel. For example, Jack watches buildings from one angle while, at the same time, John watches them from another angle.

Parallel Computing: The term parallel computing refers to the use of (parallel) supercomputers and computer clusters. Parallel computing is a form of computing in which many instructions are carried out simultaneously. It operates on the principle that large problems can almost always be divided into smaller ones, which may be solved concurrently (“in parallel”). It has been used for many years, but interest in it has grown in recent years because physical constraints now prevent further frequency scaling. Parallel computing exists in several different forms: bit-level parallelism, instruction-level parallelism, data parallelism, and task parallelism. Parallelism is a primary method for increasing the total power of a supercomputer, and computational physics applications have been the primary drivers of its development over the last 20 years. Languages (e.g., Fortran and C) and libraries (e.g., the message passing interface (MPI) and linear algebra libraries such as LAPACK) allow the programmer to access or expose parallelism in a variety of standard ways.

In the user dimension: processes of objects and events in different user services are distributed onto different grid nodes and run in parallel.
