The Emerging Computational Biolinguistic Framework

Rodolfo A. Fiorini (Politecnico di Milano University, Milan, Italy)
DOI: 10.4018/IJCINI.2018100101


The convergence of software and intelligent sciences forms the transdisciplinary field of computational intelligence. Abstract intelligence is the human enquiry into both natural and artificial intelligence at the reductive embodying levels of the neural, the cognitive, the functional, and the logical, proceeding from the bottom up (BU). The human brain is at least a factor of 1 billion more efficient than our present digital technology, and a factor of 10 million more efficient than the best digital technology we can imagine today. The unavoidable conclusion is that current neuromorphic engineering has something fundamental to learn from the human brain and from cells about a new and much more effective form of computation, one based on a convenient, effective, efficient, and reliable BU approach. The author presents a brain-inspired geometric-logical scheme defining fundamental human linguistic and predicative competence. According to CICT, the complete duality of opposition and implication geometry in logical geometry and language can model n-dimensional predicative competence and beyond, subject to available computational resources.
Article Preview

The American linguist, philosopher, cognitive scientist, and logician Avram Noam Chomsky developed his theory of syntax following his 1957 criticism of probabilistic associative models of word order in sentences based on Markov process approaches (Chomsky, 1957). In fact, the inadequacy of the probabilistic left-to-right (LTR) model (a Markov process) (Durrett, 2010) had already been noticed in 1951 by the American psychologist and behaviorist Karl Lashley (Lashley, 1951), who anticipated Chomsky's arguments by observing that probabilities between adjacent words in a sentence bear little relation to the grammaticality of the string. In the early 1960s, Chomsky and Marcel-Paul “Marco” Schützenberger started a new line of research through their distinct treatment of word, generator, relation, and language (Chomsky & Schützenberger, 1963). It is customary to refer to Noam Chomsky's “Aspects of the Theory of Syntax” as the founding document of generative grammar (Chomsky, 1965). It was the first attempt to present a general framework for the work in generative grammar that had been developing since the late 1940s, with applications in a number of languages (in rough chronological order: Hebrew, English, German, Russian, Hidatsa, Turkish, Mohawk, Japanese, and others, at varying levels of depth). Chomsky presented a formal model of grammar comprising transformationally related levels of representation, fed by a lexicon. Some of its technical details, such as the use of features in subcategorization frames or the matching analysis of relativization, continue to figure prominently in the literature; many others have been revised or replaced. For instance, the shift in (Chomsky, 1965) away from “Generalized Transformations” in favor of a recursive base component, and Chomsky's preference for concatenation over set formation, have seen dramatic reversals in recent work, Chomsky's own in particular (Berwick et al., 2011; Chomsky, 2013).
Each language makes available an unbounded array of hierarchically structured expressions that have determinate interpretations at the interfaces with other internal systems: the systems of thought and organization of action at the conceptual-intentional (CI) interface, and the sensorimotor (SM) system for externalization (production and perception), usually sound, though, as is now known, other modalities are possible. Chomsky calls this core feature of language its “Basic Principle” (BP for short). The BP comprehends the entire computational aspect of language, syntax in the broad sense, including the narrow syntax that provides the expressions mapped to the interfaces and the mappings themselves, and of course the lexical atoms of computation and their various configurations. Since then, there have, of course, been very substantial developments, specifically in cognitive computing and cognitive intelligence (Wang et al., 2006; Wang, 2012).

At that time, each individual language viewed from this perspective was called “a grammar,” in one of the uses of this systematically ambiguous expression (Fiorini, 2016a). Adopting a later terminological suggestion, the system is an I-language, where “I” signifies individual, internal, and intensional (in that we are interested in the actual generative procedure, not some set of entities that it determines: a set of utterances, structures, etc.). The theory of an I-language is a (generative) grammar. Languages can vary within the limits set by the genetic factors that specify the language faculty, called “universal grammar” (UG) in contemporary terminology. The general theory of language seeks to give an explicit account of UG, thus identifying the true nature of the BP.

As in every domain of science, we seek the simplest theory, the one with the most far-reaching capacity for explaining phenomena, rather than just describing them. From the earliest stages of acquisition, language resembles other biological systems in that what is attained is vastly underdetermined by the evidence available and mainly limited by the “poverty of stimulus” (POS) (Berwick et al., 2011) and incompleteness (Fiorini, 2017a).

The arithmetical competence that appears to be a common human possession provides triples (x, y, z), where z is the product of x and y; yet without external aids (memory, time, attention, life span, etc.), multiplication of large numbers is “vanishingly rare.” The same is true of everyone's laptop: it is possible to provide it with a calculator, but without access to external memory it will have bounded performance. Lacking any such access, the system has to be redesigned for larger calculations; with such access, it needs no redesign. That is the familiar distinction between strictly finite automata and a Turing architecture.
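The contrast above can be sketched in a few lines of code. The following is a minimal illustration (not from the article, and with hypothetical names such as `REGISTER_BITS`): a machine with a fixed, bounded register behaves like a strictly finite automaton and fails on large products, while one with access to unbounded memory, analogous to a Turing architecture, handles arbitrarily large inputs with the very same multiplicative competence.

```python
REGISTER_BITS = 8  # hypothetical fixed register width (bounded "internal memory")

def bounded_multiply(x, y):
    """Multiply only if the result fits in the fixed register; otherwise fail.

    Models a strictly finite automaton: the competence (multiplication) is
    present, but performance is bounded by the available internal resources.
    """
    z = x * y
    return z if z < 2 ** REGISTER_BITS else None  # overflow: no redesign can help

def unbounded_multiply(x, y):
    """With access to unbounded (external) memory, the same competence
    covers arbitrarily large inputs without redesign.

    Python integers are arbitrary-precision, so they stand in here for a
    Turing architecture with unbounded tape.
    """
    return x * y

print(bounded_multiply(12, 5))       # 60: fits within 8 bits
print(bounded_multiply(300, 300))    # None: exceeds the bounded register
print(unbounded_multiply(300, 300))  # 90000
```

The point of the sketch is that no change to the multiplication rule itself distinguishes the two functions; only the resource bound does, which is exactly the competence/performance distinction the paragraph draws.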
