Numerous attempts are being made to develop machines that act not only autonomously, but also in an increasingly intelligent and cognitive manner. Such cognitive machines ought to be aware of their environments, which include not only other machines but also human beings. They ought to understand the meaning of information in more human-like ways, by grounding knowledge in the physical world and in the machines’ own goals. The motivation for developing such machines ranges from self-evident practical concerns, such as the cost of computer maintenance and wearable computing in health care, to the desire to gain a better understanding of the cognitive capabilities of the human brain. Achieving such an ambitious goal requires solutions to many problems, spanning human perception, attention, concept creation, cognition, consciousness, executive processes guided by emotions and values, and symbiotic conversational human-machine interaction. An important component of this cognitive machine research is multiscale measures and analysis. This chapter presents definitions of cognitive machines and representations of their processes, as well as their measurements, measures, and analysis. It provides examples from current research, including cognitive radio, cognitive radar, and cognitive monitors.
Computer science and computer engineering have contributed to many shifts in technological and computing paradigms. For example, we have seen shifts (i) from large batch computers to personal and embedded real-time computers, (ii) from control-driven microprocessors to data- and demand-driven processors, (iii) from uniprocessors to multiple processors (loosely coupled) and multiprocessors (tightly coupled), (iv) from data-path processors to structural processors such as neural networks (Bishop, 1995), quantum processors (Nielsen and Chuang, 2000), and biomolecular processors (Sanz et al., 2003), (v) from silicon-based processors to biochips (Ruaro et al., 2005), (vi) from vacuum tubes to transistors to microelectronics to nanotechnology, (vii) from large passive sensors to very small smart active sensors (Soloman, 1999), (viii) from local computing to distributed and network-wide computing, (ix) from traditional videoconferencing to telepresence (e.g., WearTel and EyeTap (Mann, 2002)), (x) from machines that require attention (like a palmtop or a wristwatch computer) to those with constant online connectivity that drops below the conscious level of users’ awareness (like autonomic computers (Ganek and Corbi, 2003; IBM, 2006) and eyeglass-based systems (Mann, 2002; Haykin and Kosko, 2001)), (xi) from crisp-logic-based computers to fuzzy or neurofuzzy computers (Pedrycz and Gomide, 1998), and (xii) from control-driven (imperative) systems to cognitive systems such as cognitive radio (Haykin, 2005a), cognitive radar (Haykin, 2006), active audition (Haykin and Chen, 2005), and cognitive robots. These remarkable shifts have been necessitated by system complexity, which now exceeds our ability to maintain such systems (Ganek and Corbi, 2003), and facilitated by new developments in technology, intelligent signal processing, and machine learning (Haykin et al., 2006).