Decision trees are part of decision theory and are excellent tools for decision making. Most decision tree learning methods were developed within the last 30 years by scholars such as Quinlan, Mitchell, and Breiman, to name a few (Ozgulbas & Koyuncugil, 2006). A number of methods and sophisticated software packages are used to present decision trees graphically. Decision trees offer many benefits and are widely used across business functions and industries. However, there are also disagreements and concerns about how useful decision trees really are. As technology evolves, so do decision trees; consequently, new controversies continue to arise, along with solutions and proposals to address them.
Key Terms in this Chapter
Deductive Reasoning: Reasoning in which the conclusion is necessitated by previously known facts (the premises): if the premises are true, the conclusion must be true.
Algorithm: A step-by-step problem-solving procedure, especially an established, recursive computational procedure for solving a problem in a finite number of steps.
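As a minimal illustration of this definition, Euclid's algorithm for the greatest common divisor is a classic recursive procedure that is guaranteed to finish in a finite number of steps (the example below is illustrative, not drawn from the chapter):

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: a recursive, step-by-step procedure.

    It terminates in a finite number of steps because the second
    argument strictly decreases on every recursive call.
    """
    # Base case: when b is 0, a is the greatest common divisor.
    if b == 0:
        return a
    # Recursive step: replace (a, b) with (b, a mod b).
    return gcd(b, a % b)

# Example: the greatest common divisor of 48 and 18 is 6.
result = gcd(48, 18)
```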
Decision Tree: A diagram consisting of square decision nodes, circle probability nodes, and branches representing decision alternatives.
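A small sketch may make this structure concrete: a square decision node branches to alternatives, each alternative leads to a circle probability node with chance outcomes, and the standard decision rule picks the branch with the highest expected value. The alternatives and payoffs below are hypothetical, invented purely for illustration:

```python
def expected_value(outcomes):
    """Expected value at a circle (probability) node.

    Outcomes is a list of (probability, payoff) pairs on the
    branches leaving the node.
    """
    return sum(p * payoff for p, payoff in outcomes)

# Branches leaving a square (decision) node; numbers are illustrative.
alternatives = {
    "launch product": [(0.6, 100_000), (0.4, -30_000)],  # success / failure
    "do nothing":     [(1.0, 0)],
}

# Decision rule: choose the alternative whose probability node
# has the highest expected value.
best = max(alternatives, key=lambda a: expected_value(alternatives[a]))
```

Here the "launch product" branch has expected value 0.6 * 100,000 + 0.4 * (-30,000) = 48,000, so it would be chosen over "do nothing".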
Inductive Logic: Process of reasoning in which the premises of an argument support the conclusion but do not ensure it.
Data Mining: Process of automatically searching large volumes of data for patterns.
Machine Learning: Development of algorithms and techniques that allow computers to “learn”.
Overfitting: Fitting a statistical model so closely to the training data, for example by using too many parameters, that it captures noise rather than the underlying pattern and generalizes poorly to new data.
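A minimal sketch of the idea, using made-up data points: a polynomial with as many parameters as training points passes through every point exactly (zero training error), yet between the points it drifts away from the simple trend the data came from.

```python
def lagrange_predict(xs, ys, x):
    """Evaluate the interpolating polynomial through (xs, ys) at x.

    With n points this polynomial has n parameters, so it fits the
    training data perfectly -- the textbook overfitting scenario.
    """
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        term = yi
        for j, xj in enumerate(xs):
            if j != i:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

# Noisy observations of the simple trend y = x (illustrative numbers).
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [0.0, 1.5, 1.0, 3.5, 3.0]

# Training error is zero: the curve passes through every point...
train_error = abs(lagrange_predict(xs, ys, 2.0) - 1.0)

# ...but between training points it deviates from the true trend
# by more than the noise itself.
gap_at_3_5 = abs(lagrange_predict(xs, ys, 3.5) - 3.5)
```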
Classification: The task of assigning items to predefined categories; in machine learning, a supervised learning task, in contrast to clustering, which groups items without predefined labels.