Fuzzy Decision-Tree-Based Analysis of Databases

Malcolm Beynon (Cardiff University, UK)
DOI: 10.4018/978-1-59904-853-6.ch031

Abstract

The general fuzzy decision tree approach encapsulates the benefits of being an inductive learning technique to classify objects, utilising the richness of the data being considered, as well as the readability and interpretability that accompanies its operation in a fuzzy environment. This chapter offers a description of fuzzy decision tree based research, including the exposition of small and large fuzzy decision trees to demonstrate their construction and practicality. The two large fuzzy decision trees described are associated with a real application, namely, the identification of workplace establishments in the United Kingdom that pay a noticeable proportion of their employees less than the legislated minimum wage. Two separate fuzzy decision tree analyses are undertaken on a low-pay database, which utilise different numbers of membership functions to fuzzify the continuous attributes describing the investigated establishments. The findings demonstrate the sensitivity of results when there are changes in the compactness of the fuzzy representation of the associated data.

Key Terms in this Chapter

Decision Attribute: It is an attribute that characterises an object. Within a decision tree it is part of a leaf node, so it serves as the consequent in the decision rule formed by the path down the tree to that leaf node.

Iterative Dichotomiser 3 (ID3): ID3 is a well-known technique for decision tree construction that uses an entropy measure to drive the recursive partitioning of objects, adding one attribute to the tree at each split. Originally formulated in a crisp environment, it has frequently been extended to operate within a fuzzy environment.
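As a rough illustration of the entropy-driven attribute selection underlying ID3, the following is a minimal sketch (the function and variable names are illustrative, not from the chapter):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a collection of class labels."""
    total = len(labels)
    return -sum((n / total) * math.log2(n / total)
                for n in Counter(labels).values())

def information_gain(rows, labels, attribute):
    """Reduction in entropy obtained by partitioning the objects
    (rows, given as dicts) on the values of one condition attribute."""
    total = len(labels)
    gain = entropy(labels)
    partitions = {}
    for row, label in zip(rows, labels):
        partitions.setdefault(row[attribute], []).append(label)
    for subset in partitions.values():
        gain -= (len(subset) / total) * entropy(subset)
    return gain
```

At each node, ID3 augments the tree with the attribute of highest information gain and recurses on the resulting partitions.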

Condition Attribute: It is an attribute that describes an object. Within a decision tree it is part of a nonleaf node, so it serves as an antecedent in the decision rules used for the final classification of an object.

Minimum Truth Level: Within the construction of a fuzzy decision tree, this is the minimum value of a subsethood measure required for a node at the current end of a path down the tree to be made a leaf node. The minimum truth level can be set by an expert, guided by the desired complexity of the resultant fuzzy decision tree.
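A minimal sketch of how a subsethood measure can act as the truth-level test (a common Kosko-style formulation is assumed here; the names `is_leaf` and `beta` are illustrative):

```python
def subsethood(a, b):
    """Degree S(A, B) = |A intersect B| / |A| to which fuzzy set A
    (membership grades a) is contained in fuzzy set B (grades b)."""
    total = sum(a)
    if total == 0.0:
        return 1.0  # an empty fuzzy set is contained in everything
    return sum(min(x, y) for x, y in zip(a, b)) / total

def is_leaf(node_grades, class_grades, beta):
    """A node becomes a leaf when the subsethood of its membership
    grades in some class reaches the minimum truth level beta."""
    return subsethood(node_grades, class_grades) >= beta
```

A higher `beta` forces further splitting and so yields a larger, more specific tree; a lower `beta` terminates paths earlier.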

Membership Function: A membership function quantifies the grade of membership of a variable to a linguistic term.

Root Node: This is the node at the top of a decision tree, from which all paths originate and lead to a leaf node.

Exact Complete Context Spaces (ECCS): ECCS constrains the membership values of the linguistic terms making up a linguistic variable to sum to one. One considered benefit of operating within ECCS is the ability to identify a near-optimal number of membership functions for a variable.
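The sum-to-one constraint can be illustrated with triangular membership functions whose peaks partition the variable's domain, so that at any point the grades of adjacent terms sum to one (a hypothetical sketch; the fuzzification scheme is an assumption, not the chapter's own):

```python
def fuzzify(x, peaks):
    """Membership grades of value x in linguistic terms whose triangular
    membership functions peak at the given (sorted) points. Adjacent
    triangles overlap linearly, so the grades always sum to one."""
    grades = [0.0] * len(peaks)
    if x <= peaks[0]:
        grades[0] = 1.0          # below the first peak: fully the lowest term
    elif x >= peaks[-1]:
        grades[-1] = 1.0         # above the last peak: fully the highest term
    else:
        for i in range(len(peaks) - 1):
            lo, hi = peaks[i], peaks[i + 1]
            if lo <= x <= hi:
                grades[i + 1] = (x - lo) / (hi - lo)
                grades[i] = 1.0 - grades[i + 1]
                break
    return grades
```

For example, with peaks at 0, 5, and 10 (say, "low", "medium", "high"), the value 2.5 is graded 0.5 "low" and 0.5 "medium"; increasing the number of peaks gives a more compact fuzzy representation of the attribute, the sensitivity the chapter's two analyses explore.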

Linguistic Term: It is one of a set of subjective categories for a linguistic variable, each described by a membership function.

Node: It is a junction point down a path in a decision tree that describes a condition in an if-then decision rule. From a node, the current path may separate into two or more paths.

Induction: It is a technique that infers generalizations from the information in the data.

Dominant Support: It is the region of a linguistic variable's domain over which an object is associated most strongly with the respective linguistic term.

Leaf Node: It is a node not further split (the terminal grouping) in a classification or decision tree. A path down the tree to a leaf node contains the components of the respective decision rule.

Linguistic Variable: It is a variable made up of a number of words (linguistic terms) with associated degrees of membership.

Decision Tree: It is a tree-like way of representing a collection of hierarchical decision rules that lead to a class or value, starting from a root node and ending in a set of leaf nodes.

Path: A route down the tree from the root node to a leaf node; also termed a branch.
