Computational Intelligence (CI) consists of an evolving collection of methodologies often inspired by nature (Bonissone, Chen, Goebel & Khedkar, 1999; Fogel, 1999; Pedrycz, 1998). Two popular CI methodologies are neural networks and fuzzy systems. Lately, a unification was proposed in CI, at a “data level”, based on lattice theory (Kaburlasos, 2006). More specifically, it was shown that several types of data, including vectors of (fuzzy) numbers, (fuzzy) sets, 1D/2D (real) functions, graphs/trees, (strings of) symbols, etc., are partially(lattice)-ordered. In conclusion, a unified cross-fertilization was proposed for knowledge representation and modeling based on lattice theory, with emphasis on clustering, classification, and regression applications (Kaburlasos, 2006). Of particular interest in practice is the totally-ordered lattice (R,≤) of real numbers, which has emerged historically from the conventional measurement process of successive comparisons. It is known that (R,≤) gives rise to a hierarchy of lattices including the lattice (F,≤) of fuzzy interval numbers, or FINs for short (Papadakis & Kaburlasos, 2007). This article shows extensions of two popular neural networks, i.e. fuzzy-ARTMAP (Carpenter, Grossberg, Markuzon, Reynolds & Rosen, 1992) and the self-organizing map (Kohonen, 1995), as well as an extension of conventional fuzzy inference systems (Mamdani & Assilian, 1975), based on FINs. Advantages of the aforementioned extensions include both a capacity to rigorously deal with nonnumeric input data and a capacity to introduce tunable nonlinearities. Rule induction is yet another advantage.
Lattice theory was compiled by Birkhoff (1967). This section summarizes selected results regarding a Cartesian product lattice (L,≤) = (L1,≤1)×…×(LN,≤N) of constituent lattices (Li,≤i), i=1,…,N.
Given an isomorphic function θi: (Li,≤i)→(Li,≤i)∂ in a constituent lattice (Li,≤i), i=1,…,N, where (Li,≤i)∂ ≡ (Li,≥i) denotes the dual (lattice) of lattice (Li,≤i), an isomorphic function θ: (L,≤)→(L,≤)∂ is given by θ(x1,…,xN) = (θ1(x1),…,θN(xN)).
Given a positive valuation function vi: (Li,≤i)→R in a constituent lattice (Li,≤i), i=1,…,N, a positive valuation v: (L,≤)→R is given by v(x1,…,xN) = v1(x1)+…+vN(xN).
It is well known that a positive valuation vi: (Li,≤i)→R in a lattice (Li,≤i) implies a metric function di: Li×Li→R given by di(a,b) = vi(a∨b) - vi(a∧b).
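As an illustrative sketch (not part of the source), the construction above can be demonstrated in the simplest lattice, the totally ordered reals (R,≤), where join and meet are max and min; choosing the positive valuation v(x) = x recovers the usual distance |a - b|.

```python
# Sketch: a positive valuation v on a lattice induces the metric
#   d(a, b) = v(a join b) - v(a meet b).
# Illustrated on the totally ordered lattice (R, <=), where join = max
# and meet = min; with v(x) = x this recovers |a - b|.

def metric_from_valuation(v, join, meet):
    """Return the metric d(a, b) = v(join(a, b)) - v(meet(a, b))."""
    return lambda a, b: v(join(a, b)) - v(meet(a, b))

d = metric_from_valuation(v=lambda x: x, join=max, meet=min)

print(d(2.0, 5.0))   # 3.0, equals |2.0 - 5.0|
print(d(5.0, 2.0))   # 3.0, symmetric
print(d(4.0, 4.0))   # 0.0, identity of indiscernibles
```

Any strictly increasing v (e.g. v(x) = x**3) would likewise yield a valid, though nonlinear, metric on R, which is one way the framework introduces tunable nonlinearities.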
Minkowski metrics dp: (L1,≤1)×…×(LN,≤N) = (L,≤)→R are given by dp(x,y) = [d1(x1,y1)^p + … + dN(xN,yN)^p]^(1/p), where x = (x1,…,xN), y = (y1,…,yN) and p ≥ 1.
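The following sketch (an illustration, not from the source) combines per-constituent metrics di into a product-lattice Minkowski metric dp; taking each constituent to be (R,≤) with di(a,b) = |a - b| reduces dp to the familiar Minkowski distance on R^N.

```python
# Sketch: Minkowski metric d_p on a product lattice L1 x ... x LN,
# built from per-constituent metrics d_i. Each constituent here is
# (R, <=) with d_i(a, b) = |a - b|, so d_p is the usual Minkowski
# distance on R^N (p = 1: Manhattan, p = 2: Euclidean).

def minkowski(metrics, p):
    """Combine constituent metrics into d_p(x, y) = (sum_i d_i(x_i, y_i)**p)**(1/p)."""
    def d_p(x, y):
        return sum(d(a, b) ** p for d, a, b in zip(metrics, x, y)) ** (1.0 / p)
    return d_p

def d_abs(a, b):
    return abs(a - b)          # metric induced by v(x) = x on (R, <=)

d2 = minkowski([d_abs, d_abs], p=2)   # Euclidean distance on R^2
print(d2((0.0, 0.0), (3.0, 4.0)))     # 5.0
```

In the general lattice setting the constituent metrics need not be |a - b|; each di may come from a different valuation on a different (possibly nonnumeric) constituent lattice.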
Key Terms in this Chapter
Dual (Lattice): Given a lattice (L,≤), its dual lattice, symbolically (L,≤)∂ or (L,≤∂) = (L,≥), is a lattice with the inverse order relation (≥).
Isomorphic (Function): Given two lattices (L1,≤1) and (L2,≤2), an isomorphic function is a bijective (one-to-one) function f: (L1,≤1)→(L2,≤2) such that x≤y ⇔ f(x)≤f(y).
ART: ART stands for Adaptive Resonance Theory. It is a biologically inspired neural paradigm originally proposed for clustering binary patterns. An analog-pattern version of ART, namely fuzzy-ART, is applicable in the unit hypercube. The corresponding neural network for classification is called fuzzy-ARTMAP.
Rule Induction: Process of learning, from cases or instances, if-then rule relationships that consist of an antecedent (if-part, defining the preconditions or coverage of the rule) and a consequent (then-part, stating a classification, prediction, or other expression of a property that holds for cases defined in the antecedent).
Positive Valuation (Function): Given a lattice (L,≤), a positive valuation is a function v: (L,≤)→R, which satisfies both v(x)+v(y) = v(x∨y)+v(x∧y) and x<y ⇒ v(x)<v(y).
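As a quick illustration (this example is an assumption of the sketch, not from the source), set cardinality is a positive valuation on the lattice of subsets of a finite set, where ∨ is union and ∧ is intersection: the valuation law is exactly inclusion-exclusion, and a proper subset always has strictly smaller cardinality.

```python
# Sketch: cardinality as a positive valuation on a powerset lattice,
# where join = union and meet = intersection.
x = frozenset({1, 2, 3})
y = frozenset({2, 3, 4, 5})

# Valuation law: v(x) + v(y) == v(x join y) + v(x meet y)  (inclusion-exclusion)
lhs = len(x) + len(y)
rhs = len(x | y) + len(x & y)
print(lhs == rhs)                   # True

# Strict monotonicity: x < y (proper subset)  =>  v(x) < v(y)
a, b = frozenset({1}), frozenset({1, 2})
print(a < b and len(a) < len(b))    # True
```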
Sublattice: A sublattice (S,≤) of a lattice (L,≤) is another lattice such that both S⊆L and x,y∈S ⇒ x∨y, x∧y ∈ S.
Lattice: A lattice is a poset (L,≤) any two of whose elements have both a greatest lower bound (g.l.b.), denoted by x∧y, and a least upper bound (l.u.b.), denoted by x∨y.
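To make the definition concrete, the sketch below (illustrative; it assumes "generalized" intervals with a > b admitted so that every pair has a g.l.b., a convention used in lattice-theoretic CI rather than stated in this glossary) shows join and meet for real intervals ordered by inclusion.

```python
# Sketch: intervals [a, b] ordered by set inclusion, with generalized
# intervals (a > b) admitted so the meet always exists. Then:
#   l.u.b.  [a,b] join [c,d] = [min(a,c), max(b,d)]  (smallest covering interval)
#   g.l.b.  [a,b] meet [c,d] = [max(a,c), min(b,d)]  (their intersection)

def join(u, v):
    return (min(u[0], v[0]), max(u[1], v[1]))

def meet(u, v):
    return (max(u[0], v[0]), min(u[1], v[1]))

x, y = (1.0, 4.0), (2.0, 6.0)
print(join(x, y))   # (1.0, 6.0) -- smallest interval containing both
print(meet(x, y))   # (2.0, 4.0) -- the overlap of x and y
```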
FIS: FIS stands for Fuzzy Inference System. It is an architecture for reasoning involving fuzzy sets (typically fuzzy numbers) based on fuzzy logic.
Poset: A partially ordered set (or poset, for short) is a pair (P,≤), where P is a set and ≤ is an order relation on P. The latter (relation) by definition satisfies (1) x≤x, (2) x≤y and y≤x ⇒ x = y, and (3) x≤y and y≤z ⇒ x≤z.
SOM: SOM stands for Self-Organizing Map. It is a biologically inspired neural paradigm for clustering analog patterns. SOM is often used for visualizing nonlinear relations in multi-dimensional data.