Higher-order neural networks (HONNs) have been proposed as new systems. In this paper, we present some theoretical results on the associative capability of HONNs. One of them is that the memory capacity of HONNs is much larger than that of conventional neural networks. Further, we present theoretical results on homogeneous higher-order neural networks (HHONNs), in which every neuron has identical weights. HHONNs can realize shift-invariant associative memory; that is, HHONNs can associate not only a memorized pattern but also its shifted versions.
Numerous advances have been made in developing intelligent systems inspired by biological neural networks (Rumelhart, McClelland, & the PDP Research Group, 1986; Hertz, Krogh, & Palmer, 1991; Kasabov, 1996; Mehrotra, Mohan, & Ranka, 1997; Gupta, Jin, & Homma, 2003). Artificial neural networks have been designed to solve a variety of problems in pattern recognition, prediction, optimization, associative memory, and control (Hopfield, & Tank, 1985; Reid, Spirkovska, & Ochoa, 1989; Meir, & Domany, 1987; Gupta, Jin, & Homma, 2003). Associative memory is one of the well-studied applications of neural networks. Numerous associative memory models processing static and sequential patterns have been studied, such as auto-, mutual, and multidirectional associative memory (Amari, & Maginu, 1988; McEliece, Posner, Rodemich, & Venkatesh, 1987; Yanai, & Sawada, 1990; Amari, 1990; Okada, 1996; Oda, & Miyajima, 2001; Amit, Gutfreund, & Sompolinsky, 1985; Kohonen, 1972; Kosko, 1987; Yoshizawa, Morita, & Amari, 1993; Morita, 1996; Abbott, & Arian, 1987; Amari, & Yanai, 1993; Hattori, & Hagiwara, 1995). However, the capability of conventional associative memory using neural networks is known to be rather limited. Therefore, many alternative associative memory models have been proposed. Among them are HONNs, in which the potential of a neuron is represented as a weighted sum of products of input variables, and which have been applied to associative memory. It has been shown that HONNs have higher associative capability than conventional neural networks, in which the potential of a neuron is represented as a weighted sum of input variables (Chan, & Michael, 1988; Reid, Spirkovska, & Ochoa, 1989; Abbott, & Arian, 1987; Cheung, & Lee, 1993; Miyajima, Shigei, & Yatsuki, 2012). However, there are few theoretical studies on why HONNs are effective and on how large their capability is.
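To make the distinction concrete, the following is a minimal sketch of a second-order associative memory, assuming a Hebbian-style storage rule and synchronous sign updates; the sizes and the rule itself are illustrative choices, not the specific model analyzed in this paper. The potential of neuron i is a weighted sum of pairwise products of the state variables rather than a weighted sum of the variables themselves.

```python
import numpy as np

rng = np.random.default_rng(0)

N, P = 30, 10  # neurons and stored patterns (illustrative sizes)
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian-style storage for a second-order network (an assumed rule):
# w[i, j, k] = (1 / N^2) * sum_mu xi_i^mu xi_j^mu xi_k^mu
W = np.einsum('mi,mj,mk->ijk', patterns, patterns, patterns) / N**2

def recall(x, steps=5):
    """Synchronous recall with second-order potentials
    u_i = sum_{j,k} w[i, j, k] x_j x_k."""
    for _ in range(steps):
        u = np.einsum('ijk,j,k->i', W, x, x)
        x = np.where(u >= 0, 1, -1)
    return x

# Corrupt a stored pattern by flipping a few bits, then recall it.
x0 = patterns[0].copy()
noisy = x0.copy()
flip = rng.choice(N, size=3, replace=False)
noisy[flip] *= -1
print(np.array_equal(recall(noisy), x0))
```

A conventional (first-order) network would use a weight matrix `w[i, j]` and the potential `u = W @ x`; the second-order version has O(N^3) weights, which is the intuition behind its larger memory capacity.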
Further, from a practical point of view, associative memory should be invariant to pattern transformations such as shift, scaling, and rotation. However, conventional neural networks cannot inherently acquire any transformation-invariant property. Therefore, they require pre-processing of input patterns to support transformed patterns: a transformed pattern is first converted into a standard one, and then the standard pattern is input to the network. In order for neural networks to inherently acquire shift-invariant properties, their structure should be homogeneous, like that of cellular automata (Wolfram, 1984). Then, how many patterns can HONNs memorize, and how many can HHONNs?
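The effect of homogeneity can be sketched as follows, under the assumption of a one-dimensional second-order network with circular boundary conditions; the Hebbian-style kernel below is a hypothetical choice for illustration. Because every neuron applies the same weight kernel, depending only on the relative offsets (a, b), shifting the input cyclically shifts all the potentials by the same amount, so every fixed point comes with all of its cyclic shifts.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 32
xi = rng.choice([-1, 1], size=N)

# Homogeneous second-order couplings: the weight depends only on the
# relative offsets (a, b), so every neuron uses the same kernel
# (a hypothetical Hebbian-style choice built from one pattern xi):
w = np.array([[np.mean(xi * np.roll(xi, -a) * np.roll(xi, -b))
               for b in range(N)] for a in range(N)])

def potential(x):
    """u_i = sum_{a,b} w[a, b] x_{i+a} x_{i+b} (indices modulo N)."""
    u = np.zeros(N)
    for a in range(N):
        for b in range(N):
            u += w[a, b] * np.roll(x, -a) * np.roll(x, -b)
    return u

# Shift-equivariance follows directly from homogeneity: shifting the
# input by s shifts the potentials by s, so if a pattern is a fixed
# point of the sign dynamics, so is every cyclic shift of it.
x = rng.choice([-1, 1], size=N)
print(np.allclose(potential(np.roll(x, 3)), np.roll(potential(x), 3)))
# → True
```

This is exactly the structural property that lets HHONNs associate a memorized pattern and its shifted versions without any pre-processing.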