Natural systems provide unique examples of computation in a form very different from contemporary computer architectures. Biology also demonstrates capabilities such as adaptation, self-repair, and self-organisation that are becoming increasingly desirable for our technology. To address these issues, a computer model and architecture with natural characteristics is presented. Systemic computation is Turing Complete; it is designed to support biological algorithms such as neural networks, evolutionary algorithms and models of development, and shares the desirable capabilities of biology not found in conventional architectures. In this chapter the authors describe the first platform implementing such computation, including programming language, compiler and virtual machine. They first demonstrate that systemic computing is crash-proof and can recover from severe damage. The authors then illustrate various benefits of systemic computing through several implementations of bio-inspired algorithms: a self-adaptive genetic algorithm, a bio-inspired model of artificial neural networks, and finally an “artificial organism” - a program with a metabolism that eats data, expels waste, clusters cells based on data inputs and emits danger signals for a potential artificial immune system. Research on systemic computation is still ongoing, but the work presented in this chapter shows that computers that process information according to this bio-inspired paradigm have many of the features of natural systems that we desire.
Introduction
Does a biological brain compute? Can a real ant colony solve a travelling salesman problem? Does a human immune system do anomaly detection? Can natural evolution optimise chunks of DNA in order to make an organism better suited to its environment?
The intuitive answer to these questions is increasingly: we think so. Indeed, researchers are so impressed by the capabilities of nature that biological systems have become significant to computer science as examples of highly complex self-organising systems that perform tasks in parallel with no centralised method of control and show homeostatic behaviour. For example, in nature, old and potentially damaged cells are constantly being replaced and DNA repaired (Darnell, 1990). The lifespan of cells is shorter than the life of an organism, so fault-tolerance and self-maintenance are essential for the survival of the organism. The failure of some components does not destroy the overall organism; cell death is an important part of staying alive.
Features such as self-organisation, fault-tolerance or self-repair, found in natural computation, would be of great interest to our technologies. Today, software regularly crashes, top-of-the-line robots break down on the wrong kind of ground, and power distribution networks fail under unforeseen circumstances (Bentley, 2007a). With the increasing performance, potential and complexity of machines and software, it has become increasingly difficult to ensure the reliability of systems.
But how can useful biological features be achieved in computers? While the theory of computation is well understood through the concept of the Universal Turing Machine (UTM) (Turing, 1936), practical issues of architecture remain problematic for computer science and computer-based technologies. The apparent dichotomy between systems of “natural computation”, such as the brain, and computer systems based on classical designs shows that even though the two might be mathematically equivalent at a certain level of abstraction, in practice they are so dissimilar that they become incompatible.
We can state that natural computation is stochastic, asynchronous, parallel, homeostatic, continuous, robust, fault-tolerant, autonomous, open-ended, distributed, approximate and embodied; it exhibits circular causality and is complex. The traditional von Neumann architecture is, point for point, deterministic, synchronous, serial, heterostatic, batch, brittle, fault-intolerant, human-reliant, limited, centralised, precise and isolated; it exhibits linear causality and is simple. The incompatibilities are clear.
Just as the development of Prolog enabled elegant and precise implementations of logical expressions, so the development of a paradigm where systems could be defined in a manner that resembles their true structures would improve our ability to implement bio-inspired systems.
To address these issues, Bentley (2007b) introduced Systemic Computation (SC), a new model of computation and corresponding computer architecture based on a systemics world-view and supplemented by the incorporation of the natural characteristics listed above. Such characteristics are not natively present in current conventional paradigms, so models of natural processes that run on conventional computers must simulate these features. This often leads to slower and less straightforward implementations than the analytical or linear algorithms for which such computers are well suited. In contrast, systemic computation stresses the importance of structure and interaction, supplementing traditional reductionist analysis with the recognition that circular causality, embodiment in environments and emergence of hierarchical organisations all play vital roles in natural systems.
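To give a feel for this style of computation, the following is a minimal, hypothetical sketch in Python of the execution scheme the paragraph describes: computation happens when two systems interact within the scope of a third, "context" system, with interactions chosen stochastically and asynchronously rather than in a fixed sequence. The class and function names, and the single summing context, are illustrative assumptions for this sketch, not the chapter's actual SC calculus or syntax.

```python
import random

class System:
    """Illustrative stand-in for an SC system; here it just carries a value."""
    def __init__(self, value):
        self.value = value

def add_context(a, b):
    """Hypothetical context: the interaction moves b's value into a.
    The transformation is defined by the context, not by the systems."""
    a.value, b.value = a.value + b.value, 0

def run(systems, contexts, steps=100):
    """Stochastic, asynchronous execution: each step picks a random context
    and two random systems in its scope and lets them interact. There is no
    program counter and no fixed ordering of operations."""
    for _ in range(steps):
        ctx = random.choice(contexts)
        a, b = random.sample(systems, 2)  # two distinct interacting systems
        ctx(a, b)
    return systems

random.seed(0)
pool = [System(1) for _ in range(8)]
run(pool, [add_context], steps=50)
total = sum(s.value for s in pool)  # conserved regardless of interaction order
```

Note how the outcome (here, the conserved total of 8) does not depend on any particular ordering of interactions; this order-independence is one reason a stochastic, decentralised execution model can tolerate the loss or corruption of individual interactions more gracefully than a serial instruction stream.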
In this chapter we present the first platform implementing systemic computation, including programming language, compiler and virtual machine. Using this platform we first show, by implementing a genetic algorithm, how systemic computing enables fault-tolerance and easily integrated self-repair, fundamental properties of natural computing and highly desirable features in modern computational systems. Then, to demonstrate further benefits of SC programming, we provide several implementations of bio-inspired algorithms: genetic algorithms, artificial neural networks and artificial immune systems. These illustrate how SC enables ease, clarity and fidelity in the modelling of bio-inspired systems, and each also demonstrates an advanced and desirable feature provided natively by SC.