Cost-Sensitive Classification Using Decision Trees, Boosting and MetaCost

Kai Ming Ting
Copyright: © 2002 | Pages: 27
ISBN13: 9781930708266 | ISBN10: 1930708262 | EISBN13: 9781591400172
DOI: 10.4018/978-1-930708-26-6.ch003
Cite Chapter

MLA

Ting, Kai Ming. "Cost-Sensitive Classification Using Decision Trees, Boosting and MetaCost." Heuristic and Optimization for Knowledge Discovery, edited by Hussein A. Abbass, et al., IGI Global, 2002, pp. 27-53. https://doi.org/10.4018/978-1-930708-26-6.ch003

APA

Ting, K. M. (2002). Cost-Sensitive Classification Using Decision Trees, Boosting and MetaCost. In H. Abbass, C. Newton, & R. Sarker (Eds.), Heuristic and Optimization for Knowledge Discovery (pp. 27-53). IGI Global. https://doi.org/10.4018/978-1-930708-26-6.ch003

Chicago

Ting, Kai Ming. "Cost-Sensitive Classification Using Decision Trees, Boosting and MetaCost." In Heuristic and Optimization for Knowledge Discovery, edited by Hussein A. Abbass, Charles S. Newton, and Ruhul Sarker, 27-53. Hershey, PA: IGI Global, 2002. https://doi.org/10.4018/978-1-930708-26-6.ch003

Abstract

This chapter reports results obtained from a series of studies on cost-sensitive classification using decision trees, boosting algorithms, and MetaCost, a recently proposed procedure that converts an error-based algorithm into a cost-sensitive algorithm. The studies give rise to new variants of algorithms designed for cost-sensitive classification and provide insights into the strengths and weaknesses of the algorithms. First, we describe a simple and effective heuristic for converting an error-based decision tree algorithm into a cost-sensitive one via instance weighting. The cost-sensitive version performs better than the error-based version that employs a minimum expected cost criterion during classification. Second, we report results from a study of four variants of cost-sensitive boosting algorithms. We find that boosting can be simplified for cost-sensitive classification: a new variant that excludes a factor used in ordinary boosting has the advantage of producing smaller trees, and different trees for different cost scenarios, while performing comparably to ordinary boosting in terms of cost. We find that the minimum expected cost criterion is the major contributor to the improvement of all cost-sensitive adaptations of ordinary boosting. Third, we reveal a limitation of MetaCost: it retains only part of the performance of the internal classifier on which it relies, whether boosting or bagging is used as that internal classifier.
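The instance-weighting heuristic can be made concrete. One plausible reading, sketched below in Python, assigns each training instance a weight proportional to the cost of misclassifying its class, normalized so that the total weight equals the number of instances; an ordinary error-based tree learner trained on the weighted data then behaves cost-sensitively. The function name, the per-class cost mapping, and the use of scikit-learn's DecisionTreeClassifier are illustrative assumptions, not details from the chapter.

import numpy as np
from sklearn.tree import DecisionTreeClassifier

def cost_weights(y, class_cost):
    """Weight each instance by the cost C(j) of misclassifying its class j,
    normalized so the total weight equals the number of instances."""
    raw = np.array([class_cost[label] for label in y], dtype=float)
    return raw * (len(y) / raw.sum())

# Small demo: the minority class with cost 5 receives proportionally
# more weight, while the weights still sum to len(y).
y = np.array([0, 0, 0, 1])
print(cost_weights(y, {0: 1.0, 1: 5.0}))  # -> [0.5 0.5 0.5 2.5]

# Usage sketch (X, y are training inputs and labels; costs maps class -> C(j)):
# tree = DecisionTreeClassifier().fit(X, y, sample_weight=cost_weights(y, costs))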
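The minimum expected cost criterion referred to throughout predicts, for each example, the class whose expected misclassification cost is lowest under the classifier's probability estimates. A minimal sketch, assuming a cost matrix in which cost_matrix[i, j] is the cost of predicting class i when the true class is j (the convention and function name are assumptions):

import numpy as np

def min_expected_cost_class(class_probs, cost_matrix):
    """Return the index of the class with minimum expected cost.

    class_probs: shape (n_classes,), estimated P(j | x) for one example.
    cost_matrix: shape (n_classes, n_classes).
    """
    # Expected cost of predicting class i: sum_j P(j | x) * cost_matrix[i, j]
    expected_costs = cost_matrix @ class_probs
    return int(np.argmin(expected_costs))

# Example: with P = [0.7, 0.3] and a cost matrix that penalizes missing
# class 1 heavily, the criterion overrides the argmax prediction.
C = np.array([[0.0, 10.0],
              [1.0, 0.0]])
print(min_expected_cost_class(np.array([0.7, 0.3]), C))  # -> 1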
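MetaCost's core mechanism can likewise be sketched: it estimates class probabilities with an internal ensemble (bagging in the original proposal), relabels each training instance with its minimum-expected-cost class, and retrains a single error-based learner on the relabeled data. The sketch below simplifies the probability estimation step (full MetaCost averages votes over the bagged models in a particular way), and the function and parameter names are illustrative:

import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier

def metacost_relabel(X, y, cost_matrix, n_estimators=10, random_state=0):
    """Relabel training data with the minimum-expected-cost class,
    using a bagged ensemble's probability estimates (MetaCost's core step).

    cost_matrix: ndarray with cost_matrix[i, j] = cost of predicting
    class i when the true class is j, classes ordered as in ensemble.classes_.
    """
    ensemble = BaggingClassifier(
        estimator=DecisionTreeClassifier(),
        n_estimators=n_estimators,
        random_state=random_state,
    ).fit(X, y)
    probs = ensemble.predict_proba(X)   # P(j | x) for every training instance
    expected = probs @ cost_matrix.T    # expected cost of each possible prediction
    return ensemble.classes_[np.argmin(expected, axis=1)]

# A final error-based learner trained on (X, relabeled_y) then makes
# cost-sensitive predictions using plain classification.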
