
Input Space Partitioning for Neural Network Learning

Volume 4, Issue 2. Copyright © 2013. 11 pages.
DOI: 10.4018/jaec.2013040105
Citation: Guo, S., Guan, S., Li, W., Man, K. L., Liu, F., & Qin, A. K. (2013). Input Space Partitioning for Neural Network Learning. International Journal of Applied Evolutionary Computation (IJAEC), 4(2), 56-66. doi:10.4018/jaec.2013040105

Abstract

To improve the learning performance of neural networks (NNs), this paper introduces an NN ensemble method based on input-attribute grouping. All input attributes are partitioned into exclusive groups according to the degree of inter-attribute promotion or correlation, which quantifies the supportive interactions between attributes. After partitioning, multiple NNs are trained, each taking one group of attributes as its input. The final classification result is obtained by integrating the outputs of all the NNs. Experimental results on several UCI datasets demonstrate the effectiveness of the proposed method.
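The overall procedure described in the abstract can be sketched as follows. This is a hypothetical Python illustration, not the authors' implementation: the scikit-learn MLP base learner and the majority-vote integration rule are assumptions made for the sake of a runnable example.

```python
# Hypothetical sketch of the attribute-grouping NN ensemble (not the authors' code).
# Assumes integer class labels >= 0 and majority voting as the integration rule.
import numpy as np
from sklearn.neural_network import MLPClassifier

def train_group_ensemble(X, y, groups):
    """Train one NN per exclusive attribute group.

    groups: list of lists of column indices, e.g. [[0, 2], [1, 3, 4]].
    """
    models = []
    for cols in groups:
        clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000)
        clf.fit(X[:, cols], y)
        models.append((cols, clf))
    return models

def predict_group_ensemble(models, X):
    """Integrate per-group predictions by simple majority vote (an assumption)."""
    votes = np.stack([clf.predict(X[:, cols]) for cols, clf in models]).astype(int)
    # Majority vote across the ensemble members for each sample.
    return np.apply_along_axis(lambda col: np.bincount(col).argmax(), axis=0, arr=votes)
```

A call such as predict_group_ensemble(train_group_ensemble(X, y, groups), X_test) then yields the integrated ensemble prediction.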
Article Preview

Terminology And Concepts

Let $p_{ij}$ denote the promotion rate of two attributes $i$ and $j$, which is defined by:

$$p_{ij} = \begin{cases} 1, & \text{if } e_{ij} < \min(e_i, e_j) \\ 0, & \text{otherwise} \end{cases} \qquad (1)$$

where $e_i$ represents the classification error obtained by training with the single attribute $i$, and $e_{ij}$ represents the classification error obtained by training with the two attributes $i$ and $j$. When the promotion rate of two attributes is 1, the two attributes are considered mutually supportive for classification; otherwise, they are considered to interfere with each other for classification.
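As an illustration, the promotion indicator can be estimated along the following lines. This is a minimal Python sketch, not the authors' implementation; the cross-validated MLP error estimator and the comparison of $e_{ij}$ against both single-attribute errors follow the reconstruction of Equation (1) above and are assumptions.

```python
# Minimal sketch of the promotion-rate indicator p_ij from Equation (1).
# The error estimator (3-fold cross-validated MLP error) is an illustrative choice;
# the method only requires some estimate of classification error.
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

def classification_error(X, y, cols):
    """Estimated error when training on the given attribute columns only."""
    clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=1000)
    accuracy = cross_val_score(clf, X[:, cols], y, cv=3).mean()
    return 1.0 - accuracy

def promotion_rate(X, y, i, j):
    """p_ij = 1 if training on {i, j} beats training on i or j alone."""
    e_i = classification_error(X, y, [i])
    e_j = classification_error(X, y, [j])
    e_ij = classification_error(X, y, [i, j])
    return 1 if e_ij < min(e_i, e_j) else 0
```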

To take full advantage of inter-attribute promotions, we compute the average value of all $e_{ij}$ with $p_{ij} = 1$. Any pair of attributes whose classification error $e_{ij}$ is less than this average value is considered to promote each other significantly. The smaller the corresponding classification error, the more significant the promotion.
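Continuing the sketch, significantly promoting pairs could then be selected by thresholding against the mean pair error, as described above; the sorting by error is an illustrative choice reflecting "the smaller the error, the more significant the promotion".

```python
# Hypothetical selection of significantly promoting attribute pairs.
# pair_errors: dict mapping (i, j) -> e_ij for the pairs with promotion rate 1.
def significant_pairs(pair_errors):
    if not pair_errors:
        return []
    mean_err = sum(pair_errors.values()) / len(pair_errors)
    # Pairs whose joint error falls below the average are treated as significantly
    # promoting each other; lower error means stronger promotion, so sort ascending.
    return sorted(
        (pair for pair, err in pair_errors.items() if err < mean_err),
        key=lambda pair: pair_errors[pair],
    )
```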

In statistics, correlation measures the strength and direction of the linear relationship between two random variables. There are many ways of calculating correlation; this paper employs Pearson's correlation coefficient (Sedgwick, 2012).
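For reference, Pearson's correlation coefficient between two attribute columns is the covariance divided by the product of the standard deviations. A small helper such as the following (or numpy's built-in np.corrcoef) computes it:

```python
# Pearson's correlation coefficient between two attribute vectors x and y:
# r = cov(x, y) / (std(x) * std(y))
import numpy as np

def pearson_r(x, y):
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc)))
```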
