Introduction
For at least the last three thousand years, as far as we know, the analysis of the surrounding physical reality has followed the line of building simplified models and solving problems with specific tools, developed under suitable approximations that make them easy, or relatively easy, to handle. The unquestioned successes of this scientific way of thinking have allowed us to create so-called civilization. However, this method, which we may call (Wang, 2007b) Imperative Computing, has limitations. Some problems cannot be treated this way, at least at present. There is a belief that, e.g., the mathematical tools needed to solve some problems in quantum field theory or in hydro- or thermodynamics will be developed in the future. Some see hope in ‘analog’ quantum computations. At the moment, however, there are much more common problems where the usual methods of ‘standard analysis’ sometimes fail (the general problem of pattern recognition is a perfect example). Quite recently, Cognitive Informatics (CI) theory has pointed to another possible solution: Autonomic Computing (AC). By definition (Wang, 2007b), an AC system is not a passive one; it uses, among other mechanisms, inference-driven ones to obtain a (nondeterministic!) result. The realization is not obvious, but we propose that the use of untrained children’s brains follows the general concept described in the Layered Reference Model of the Brain (Wang et al., 2006). The word ‘untrained’ is important here. A preliminary comparison with the ‘professional’ brain is discussed in the present work. The role of prior knowledge is expected and observed, but it appears to be surprisingly small.
We would like to discuss here a particular problem: describing the data registered by a cosmic ray physics experiment. The standard analysis involves extensive Monte Carlo studies (and there are still discrepancies between different groups of experimentalists and theoreticians). The situation has reached the level described very well in the web ‘intro’ to the IBM Autonomic Computing Manifesto (IBM, 2001): “Computing is too hard. It’s time we stop our preoccupation with faster and more powerful and start making them smarter.”
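To make the Monte Carlo side of the standard analysis concrete, the following sketch estimates a detector's mean registered signal by repeated random sampling. It is purely illustrative: the toy shower model (a Gaussian-approximated particle count and a fixed 80% detection efficiency) is our assumption, not the actual cosmic-ray simulation chain, which relies on far more elaborate codes such as CORSIKA.

```python
import random

# Toy Monte Carlo: estimate the mean signal registered by an idealized
# detector.  The shower model below is illustrative only -- real
# cosmic-ray analyses use full air-shower simulation codes.

def simulate_event(mean_particles=50.0, efficiency=0.8, rng=random):
    """One simulated event: the number of particles actually detected."""
    # Particle count reaching the detector (Gaussian approximation of a
    # Poisson distribution with the same mean).
    n = int(rng.gauss(mean_particles, mean_particles ** 0.5))
    n = max(n, 0)
    # Each particle is detected independently with the given efficiency.
    return sum(1 for _ in range(n) if rng.random() < efficiency)

def monte_carlo_mean(n_events=10_000, seed=42):
    """Average detected signal over many simulated events."""
    rng = random.Random(seed)
    total = sum(simulate_event(rng=rng) for _ in range(n_events))
    return total / n_events

if __name__ == "__main__":
    # The estimate should be close to 50 * 0.8 = 40.
    print(monte_carlo_mean())
```

The point of the sketch is the method, not the numbers: observed data are compared with distributions built from many such simulated events, and discrepancies between groups arise from differing model assumptions inside `simulate_event`-like components.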
Our statement is that some extremely complex problems can be solved, not only qualitatively but also quantitatively, at the same level of precision as the standard statistical methods. This can be done not only by an artificial neural network (ANN) trained for this particular problem, but also by the (over-sized, redundant) natural neural network (NNN), using its ‘natural’ meta- and higher cognitive functions acquired in the past. These functions belong to the natural intelligence (NI) application layers, a category of conscious life functions of the NI system, and they are not obviously (or obviously not) related to the particular problem at hand.
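For the ANN side of this comparison, a minimal example may help fix ideas. The sketch below trains a one-hidden-layer feed-forward network by plain gradient descent on the XOR task; the 2-2-1 architecture, the task, and all hyperparameters are our illustrative assumptions, not the networks actually used in the study.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class TinyNet:
    """A 2-2-1 feed-forward network with sigmoid units (illustrative only)."""

    def __init__(self, seed=0):
        rng = random.Random(seed)
        self.w1 = [[rng.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
        self.b1 = [0.0, 0.0]
        self.w2 = [rng.uniform(-1, 1) for _ in range(2)]
        self.b2 = 0.0

    def forward(self, x):
        # Hidden activations, then the single output unit.
        self.h = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
                  for row, b in zip(self.w1, self.b1)]
        self.y = sigmoid(sum(w * h for w, h in zip(self.w2, self.h)) + self.b2)
        return self.y

    def train_step(self, x, target, lr=0.5):
        """One gradient-descent step on squared error 0.5*(y - t)**2."""
        y = self.forward(x)
        dy = (y - target) * y * (1 - y)
        for j in range(2):
            # dh uses the pre-update w2[j], as backpropagation requires.
            dh = dy * self.w2[j] * self.h[j] * (1 - self.h[j])
            self.w2[j] -= lr * dy * self.h[j]
            for i in range(2):
                self.w1[j][i] -= lr * dh * x[i]
            self.b1[j] -= lr * dh
        self.b2 -= lr * dy
        return 0.5 * (y - target) ** 2

XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def train(epochs=5000, seed=0):
    net = TinyNet(seed)
    loss = 0.0
    for _ in range(epochs):
        loss = sum(net.train_step(x, t) for x, t in XOR)
    return net, loss
```

Such a network is "trained for this particular problem" in the sense used above: its weights encode nothing but the task. The contrast drawn in the text is with the NNN, whose redundant, pre-existing cognitive machinery was never fitted to the problem at all.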
Methods of analyzing the NNN and ANN performance are described, and some first results are given in this paper. We would like to emphasize that the present analysis is an interesting particular example of the domination of Artificial Intelligence by Natural Intelligence (Wang, 2007a).