Searching for Self-Replicating Systems

DOI: 10.4018/978-1-61520-787-9.ch007

Abstract

In Complexity Science (Bak, 1996; Morin, 2001; Gell-Mann, 1994; Prigogine & Stengers, 1984) and Artificial Life (Langton, 1995; Adami, 1998), almost all attempts to simulate or synthesize living systems in new media are somehow related to the influential work of John von Neumann (1966). These studies can be grouped into four basic categories (Sipper et al., 1998).

Introduction

In Complexity Science (Bak, 1996; Morin, 2001; Gell-Mann, 1994; Prigogine & Stengers, 1984) and Artificial Life (Langton, 1995; Adami, 1998), almost all attempts to simulate or synthesize living systems in new media are somehow related to the influential work of John von Neumann (1966). These studies can be grouped into four basic categories (Sipper et al., 1998):

  1. Attempts to build universal constructors based on von Neumann’s self-replicating automaton. This work mainly dates to the 1950s and 1960s (von Neumann, 1966; Codd, 1968; Vitányi, 1973).

  2. Attempts to create a minimal system capable of non-trivial self-replication. This line of study began with Langton (1984), with later contributions from Byl (1989), Reggia et al. (1993), and Morita and Imai (1996).

  3. The enhancement of self-replicators with additional computational capabilities. This was a key topic of research in the 1990s (Tempesti, 1995; Perrier, Sipper, and Zahnd, 1996; Chou and Reggia, 1998).

  4. Implementation of evolving, emergent self-replicators. This work began in the 1990s and is still in progress (Lohn and Reggia, 1995, 1997; Chou and Reggia, 1997; Sayama, 1998, 2000, 2004, 2006).

The basic idea underlying all this work is that an artificial self-replicating system can be described by a logical sequence of steps, that is, by an algorithm. Until a few years ago, only a relatively small number of such systems were known. Today, however, many researchers have developed highly effective methods for generating them. One of these methods is the use of Genetic Algorithms (Mitchell et al., 1994; Mitchell, 1996). Particularly important is Wuensche’s method (1999), which characterizes order and chaos in cellular automaton (CA) rules by measuring the Shannon entropy of the frequency with which the individual rule-table entries are used (the Input Entropy).
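
As a rough illustration of this measure, the following Python sketch simulates a one-dimensional binary CA with radius 1 (elementary rule 110 is used here purely as an example) and computes, at each time step, the Shannon entropy of the lookup frequencies of the rule-table entries. The function and variable names are ours, not part of Wuensche's implementation.

```python
import numpy as np

# Elementary CA rule 110 as a lookup table: neighbourhood (left, centre, right) -> new cell value.
RULE_110 = {
    (1, 1, 1): 0, (1, 1, 0): 1, (1, 0, 1): 1, (1, 0, 0): 0,
    (0, 1, 1): 1, (0, 1, 0): 1, (0, 0, 1): 1, (0, 0, 0): 0,
}

def step_with_input_entropy(state, rule_table, radius=1):
    """Advance a 1D binary CA one step (periodic boundary) and return the new
    state together with the Shannon entropy of the rule-table lookup
    frequencies observed during that step."""
    n = len(state)
    counts = {nb: 0 for nb in rule_table}
    next_state = np.empty_like(state)
    for i in range(n):
        nb = tuple(int(state[(i + j) % n]) for j in range(-radius, radius + 1))
        counts[nb] += 1
        next_state[i] = rule_table[nb]
    freqs = np.array([c / n for c in counts.values() if c > 0])
    entropy = float(-np.sum(freqs * np.log2(freqs)))
    return next_state, entropy

# Run from a random initial configuration and record the entropy at each step.
rng = np.random.default_rng(0)
state = rng.integers(0, 2, size=200)
entropies = []
for _ in range(300):
    state, h = step_with_input_entropy(state, RULE_110)
    entropies.append(h)
```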

The values of this distribution vary over time, following the evolution of the system. Wuensche identified characteristic features in the dynamics of his indicator and used them as the basis for an empirical classification of CA behavior. Ordered systems converge rapidly to equilibrium, after which they generate no further structural change apart from short-period oscillations. Typically, they use relatively few rules, some frequently and others very rarely. In chaotic systems, the frequency distribution is uniform: these systems show no preference for specific rules or for rules with a special ability to process information. Finally, complex systems rely heavily on just a few rules, with a high variance in Input Entropy, while the frequencies of use of the remaining rules are uniform.
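
This classification can be sketched as a simple heuristic over the Input Entropy time series: low mean entropy with little fluctuation suggests ordered behavior, high mean entropy with little fluctuation suggests chaos, and high variance suggests complexity. The function name and thresholds below are illustrative placeholders, not Wuensche's calibrated criteria.

```python
import numpy as np

def classify_by_input_entropy(entropies, n_table_entries,
                              var_threshold=0.02, high_frac=0.9, low_frac=0.4):
    """Rough, illustrative classification of CA dynamics from an
    input-entropy time series.

    n_table_entries: number of rule-table entries (8 for an elementary CA),
    so the maximum possible entropy is log2(n_table_entries).
    """
    e = np.asarray(entropies, dtype=float)
    max_entropy = np.log2(n_table_entries)
    mean, var = e.mean(), e.var()
    if var > var_threshold:
        return "complex"    # entropy fluctuates strongly over time
    if mean > high_frac * max_entropy:
        return "chaotic"    # nearly uniform use of all rule-table entries
    if mean < low_frac * max_entropy:
        return "ordered"    # only a few entries remain in use
    return "undetermined"

# e.g. classify_by_input_entropy(entropies, n_table_entries=8)
```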
