The 20th century saw the beginning of the evolution of learning machines, from the growth of Boolean computers into Bayesian inference machines (Knuth, 2003). For some this is the crux of Artificial Intelligence (AI); however, AI research generally has yielded a plethora of specifically engineered but formally unrelated theories/models, with varied levels of application success and failure but without a commonly explicatable conceptual foundation (i.e., it has left a theory-glut). Despite these many approaches to AI, including Automated Neural Nets, Natural Language Processing, Genetic Algorithms, Fuzzy Logic and Fractal Mathematical computational approaches, to identify only a few, AI itself has remained an elusive goal to achieve by means of a systems architecture implemented on the systemic computer paradigm. The 21st-century experience is overwhelmingly one of an ever-accelerating, dynamically changing world. Just staying in place seems nearly impossible; getting ahead is becoming increasingly unfathomable in a world now characterized by an evolving dominance of Information Science and Technology Development in exponentially tighter (shorter) innovation cycles (IBM, 2008). In business, for example, there is the continuous challenge to ensure that the business’s products appear clearly differentiated from the competition while staying current with the never-ending hot new trends that buffet the industry. A prime case in point is staying current with trends in the computer solutions industry, since adapting a computer-dependent business (and most are) for the next big trend can be expected to be undercut, if not made completely obsolete, by the next next big trend already on the radar screen.
It is becoming increasingly evident to a growing number of key decision makers that innovation development and management demands a technological assist (Roco & Bainbridge, 2002). This technology, however, must dramatically Augment Human Intelligence in the near future while moving toward a General Autonomous Artificial Intelligence in the longer term (Singularity Institute for Artificial Intelligence, Inc., 2001). Despite the recognition that meeting the demands of accelerating innovation is only likely through advancing AI, which in turn has the potential to impact every aspect of human life, the problem/dilemma for AI developers is that there is no standard theory of mind.
To further accentuate this circumstance, the networking of computers has in turn led to the Web, with essentially unlimited growth of data/information (i.e., an info-glut). The industry’s response to the info-glut problem, however, has been an ever-growing abundance of Web-access tools, which to an average user seem, ironically, like only another “glut” (a technology-glut or tool-glut).
Proposed theories of the Web, as with AI, are also numerous and without a common foundation on which to build a mutual understanding of AI and the Web. There is also a plethora of heuristic technological approaches to AI and the Web, ranging from IntelligizingTM the Web through Learning/Thinking Webs to the Web as a Global (Super) Brain and Virtual Reality as Social Superorganism [see, for instance, these topics at Principia Cybernetica Web (2008)]. Basically, however, research on AI and the Web is categorizable by whether the focus is on the preeminence of brain over mind (Roco & Bainbridge, 2002), as in the Human Cognome Project keyed to reverse engineering the human brain; on mind over brain, via a modular description of a general intelligence capable of open-ended recursive self-enhancement (Singularity Institute for Artificial Intelligence (2001), General Intelligence and Seed AI); or, alternatively, on the co-evolution of mind and brain, characterized by Project AutoGnomeTM/CoGnomeTM/CogWebTM, this being the approach of Ai3inc.
Key Terms in this Chapter
Algebra of Probable Inference/Inquiry: The Algebra of Probable Inference/Inquiry is a common-sense foundational reformation of the concepts of Probability, by the simple generalization of implication among logical statements in the Boolean algebra to degrees of implication, and of Entropy, by generalizing a particular function of the question lattice to a valuation called relevance, which is a measure of the degree to which a statement answers a given question. This effectively establishes probability theory as logic.
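As a minimal sketch of the generalization from implication to degrees of implication, assume statements are modeled as subsets of a finite set of equally likely states; the `degree` helper and the example statements below are illustrative, not from the chapter, and a uniform measure stands in for the full lattice-theoretic construction:

```python
from fractions import Fraction

def degree(a, b):
    """Degree to which statement b implies statement a: p(a|b) under a
    uniform measure over states. Ordinary Boolean implication is
    recovered at the extremes (degree 1 when b is a subset of a,
    degree 0 when a and b share no states)."""
    return Fraction(len(a & b), len(b))

states = set(range(8))     # eight possible worlds (the truism)
even   = {0, 2, 4, 6}      # statement: "the state is even"
small  = {0, 1, 2, 3}      # statement: "the state is less than 4"

d_truism  = degree(states, small)  # any statement implies the truism
d_partial = degree(even, small)    # partial implication: a proper fraction
```

Because `small` shares exactly two of its four states with `even`, the degree of implication is 1/2 rather than the all-or-nothing verdict of Boolean logic; this is the sense in which the algebra "establishes probability theory as logic."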
Semiotic Relational Systems: A Semiotic Relational System is a system of relations exhaustively admitting all forms of interrelatedness among systems and/or relations, with certain systems or relations taking the place of (i.e., imaging (signifying)) other systems or relations.
AutoGnome: The AutoGnome is a self-knowing, general-purpose software system of automated (autonomous) inquiry, inference and intuition exploiting a mechanized carrier system for relational semiosis as a virtual (synthetic) mind.
CogWeb: The CogWeb is the Network of Intellisites implementing the CoGnome for Network Decision-Making by autonomously formed Intellisite-defined groups, organizations, communities, and societies.
Order/DisOrder/ReOrder Form: It is a tenet of Relational Systems that any semiotic act must, of necessity, express the Form of experience as the inseparable conjunction of: Ordered (i.e., determined or certain) experience, via a formal algebra/logic of semiosis; DisOrdered (indeterminate or uncertain) experience, via a theory of probable inference/inquiry; and the ReOrdering of DisOrdered experience, via a generalized probabilistic optimization principle. That is to say: experience is all at once partially ordered, partially chaotic and partially organizable.
CoGnome: The CoGnome is a selected WebGnome which, inter-connecting two or more Intellisites in a Network of Intellisites, provides a computerized collective intelligence, an automated cointelligence: the Collective-AutoGnome (Auto(Co)Gnome), or simply the CoGnome.
Boundary Mathematics: Boundary Mathematics is a semiotic formalism generated by creating a distinction (a boundary) in nonexistence (of system), thus resulting in a first system. Extended to multiboundaries with a common-sense reiterative reduction rule leading either to one distinction or to nonexistence, this mathematical form and process is the germ of an approach to the formulation of a universal language of mathematics.
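The reiterative reduction rule described under Boundary Mathematics can be sketched in a few lines, assuming boundaries are written as parentheses and the two rewrites are the crossing and condensation rules of the boundary-mathematics (Laws of Form) tradition; `reduce_form` is a hypothetical helper for illustration:

```python
def reduce_form(expr):
    """Reiteratively reduce a boundary form written with parentheses.
    Two rewrite rules are applied until nothing changes:
      crossing:     (())  ->  (void)  a boundary within a boundary vanishes
      condensation: ()()  ->  ()      adjacent boundaries condense to one
    Any well-formed expression settles to '()' (one distinction)
    or '' (nonexistence)."""
    prev = None
    while expr != prev:
        prev = expr
        expr = expr.replace("(())", "").replace("()()", "()")
    return expr

form = reduce_form("((())())")  # reduces stepwise to nonexistence
```

For instance, `((())())` first loses its inner `(())`, leaving `(())`, which vanishes in turn; the whole expression reduces to nonexistence, while `((()))` reduces to the single distinction `()`.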
Maximum Entropy (MaxEnt) Principle: MaxEnt is a technique for automatically acquiring probabilistic knowledge from incomplete information without making any unsubstantiated assumptions. Entropy is a mathematical measure of uncertainty or ignorance: greater entropy corresponds to greater ignorance. Hence, the MaxEnt solution is the least biased possible solution given whatever is experimentally known, but assuming nothing else.
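As an illustration of MaxEnt, consider the classic constrained-die problem: find the least biased distribution over the six faces given only a known mean. The MaxEnt solution has the exponential form p_i proportional to exp(lam * i); the `maxent_dice` helper below is a hypothetical sketch that solves for the multiplier lam by bisection, not code from the chapter:

```python
import math

def maxent_dice(target_mean, faces=range(1, 7), tol=1e-10):
    """Maximum-entropy distribution over die faces subject to a known
    mean: p_i is proportional to exp(lam * i), with the Lagrange
    multiplier lam found by bisection so the mean constraint holds."""
    faces = list(faces)

    def mean_for(lam):
        # Mean of the exponentially tilted distribution for this lam.
        weights = [math.exp(lam * f) for f in faces]
        total = sum(weights)
        return sum(f * w for f, w in zip(faces, weights)) / total

    lo, hi = -10.0, 10.0          # mean_for is increasing in lam
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    weights = [math.exp(lam * f) for f in faces]
    total = sum(weights)
    return [w / total for w in weights]

# With no information beyond the fair mean 3.5, MaxEnt recovers the
# uniform distribution -- the least biased possible assignment.
p_uniform = maxent_dice(3.5)
# A reported mean of 4.5 tilts probability toward the higher faces,
# but no further structure is assumed.
p_biased = maxent_dice(4.5)
```

Note how the entry's claim plays out: the mean-4.5 solution is not a guess about *which* faces are loaded, only the flattest distribution consistent with what is known.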
Intellisite: The Intellisite (an Intelligent Website) is a constructed software environment (a Website) with an embedded form of the AutoGnome known as a WebGnome: an intelligent agent residing in this cyberspace environment which, through continuous adaptive learning from mimicking the user’s behavior, will grow into a like-minded replica (Mind Clone) of a user-self, acting in the Virtual Reality of the Internet with the synthetic-mind capabilities of the AutoGnome.