This chapter presents macroscopic properties of higher order neural networks. Randomly connected Neural Networks (RNNs) are known as a convenient model for investigating the macroscopic properties of neural networks, and they are analyzed by the statistical method of neuro-dynamics. By applying this approach to higher order neural networks, their macroscopic properties are clarified. The approach establishes: (a) there are differences between the stability of RNNs and Randomly connected Higher Order Neural Networks (RHONNs) in the cases of the digital state model and the analog state model; (b) there is no difference between the stability of RNNs and RHONNs in the cases of the digital state model and the analog state model; (c) for neural networks with oscillation, there are large differences between RNNs and RHONNs in the cases of the digital state model and the analog state model, that is, complex dynamics exists in each model; (d) the behavior of groups composed of RHONNs is represented as a combination of the behavior of each RHONN.
Introduction
It is well known that the brain has a stable structure, in the sense that damage to some of its neurons does not always destroy its function, and that it may use dynamic attractors to hold memory rather than static states as in most (artificial) neural networks (Amari & Maginu, 1988; Hopfield, 1982; Kohonen, 1988; McEliece, et al., 1987; Palm, 1980; Rumelhart & McClelland, 1986; Wang & Ross, 1991). For this reason, studying the dynamics of interconnected neural networks from the macroscopic viewpoint offers both an approach to understanding how the brain processes information and guidance for constructing artificial neural networks. Many studies have been made of the macroscopic behavior of neural networks (Amari, 1971, 1972, 1974; Anninos, et al., 1970; Venzl, 1976). Amari (1971, 1972, 1974) investigated macroscopic behavior by using the statistical method of neuro-dynamics and showed that only monostable, bistable, and oscillatory networks exist in this case (Amari, 1974). The macroscopic dynamics of the traditional model, whose potential is represented as the linear weighted sum of the input vector, is thus simple, and this simplicity marks the limit of the model. Therefore, in the traditional model, complex behavior is achieved by combining simple neural networks, using a nonmonotone output function, or changing signals from the outside (Hjelmfelt & Ross, 1994; Hopfield, 1982; Moreira & Auto, 1993; Morita, et al., 1990; Morita, 1993; Yao, et al., 1991). On the other hand, Higher Order Neural Networks (HONNs), whose potential is computed by using second- and third-order products of the inputs, have been proposed as a generalized model of the traditional one (Rumelhart & McClelland, 1986).
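The distinction above can be illustrated with a minimal sketch. The function names and weight values below are hypothetical, chosen only to contrast a traditional (first-order) potential, a weighted linear sum over the state vector, with a second-order HONN potential, a weighted sum over pairwise products of state components:

```python
def first_order_potential(w, x):
    # Traditional model: u = sum_j w[j] * x[j]
    return sum(wj * xj for wj, xj in zip(w, x))

def second_order_potential(w2, x):
    # Second-order HONN: u = sum_{j,k} w2[j][k] * x[j] * x[k]
    n = len(x)
    return sum(w2[j][k] * x[j] * x[k] for j in range(n) for k in range(n))

def sign(u):
    # Digital-state output function with states {+1, -1}
    return 1 if u >= 0 else -1

# Illustrative usage with arbitrary weights and a digital state vector
x = [1, -1, 1]
w = [0.5, -0.2, 0.1]
u1 = first_order_potential(w, x)        # 0.5 + 0.2 + 0.1 = 0.8
print(sign(u1))                          # prints 1
```

A third-order model would extend the same pattern with a triple sum over products x[j]·x[k]·x[l]; the higher-order weight tensor is what gives HONNs their larger representational capacity.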
It is known that HONNs are superior to traditional neural networks in associative memory and combinatorial optimization problems (Cheng & Lee, 1993; Cooper, 1995; Giles & Maxwell, 1988; Miyajima, et al., 1996, 1995; Perantonis & Lisboa, 1992; Psaltis, et al., 1988; Simpson, 1990; Villalobos & Merat, 1995; Yatsuki & Miyajima, 1997). In particular, in associative memory the complex dynamics of HONNs has led to the development of powerful algorithms (Psaltis, et al., 1988; Simpson, 1990; Yatsuki & Miyajima, 1997). Can HONNs, then, really perform complex dynamics in the macroscopic or microscopic sense? It is necessary to investigate not only the superiority of HONNs in applications but also the qualitative differences in characteristics between HONNs and traditional neural networks. Few such studies of HONNs from the macroscopic viewpoint have been made. Randomly connected neural networks have been proposed as a convenient model for investigating the macroscopic properties of neural networks and can easily be extended to the case of HONNs (Amari, 1971, 1972, 1974; Cooper, 1995).