Preprocessing Perceptrons and Multivariate Decision Limits

Patrik Eklund, Lena Kallin Westin
DOI: 10.4018/978-1-60566-218-3.ch005

Abstract

Classification networks, consisting of preprocessing layers combined with well-known classification networks, are well suited for medical data analysis. Additionally, by adjusting network complexity to the complexity of the data, the parameters in the preprocessing network can be understood more precisely than those of higher-complexity networks and effectively utilised as decision limits. Further, a multivariate approach to preprocessing is shown in many cases to increase correctness rates in classification tasks. Handling network complexity in this way thus leads to efficient parameter estimation as well as useful parameter interpretation.
Chapter Preview

The Preprocessing Perceptron

A linear regression is given by the weighted sum

$$y = \sum_{i=1}^{n} w_i x_i + \gamma,$$

where $x_i$ are the inputs and $w_i$ and $\gamma$ are the parameters of the linear function. A logistic regression performs a sigmoidal activation of the weighted sum, i.e.,

$$y = \frac{1}{1 + e^{-\left(\sum_{i=1}^{n} w_i x_i + \gamma\right)}}.$$
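As a minimal illustration of the two regression forms above, the following Python sketch computes the weighted sum and its sigmoidal activation; the function names, the use of NumPy, and the example parameter values are our own choices, not taken from the chapter.

```python
import numpy as np

def linear_regression(x, w, gamma):
    # Weighted sum of the inputs plus the constant term gamma.
    return np.dot(w, x) + gamma

def logistic_regression(x, w, gamma):
    # Sigmoidal activation of the same weighted sum,
    # giving an output in (0, 1).
    return 1.0 / (1.0 + np.exp(-linear_regression(x, w, gamma)))

# Example with three inputs and illustrative (made-up) parameters.
x = np.array([0.2, 1.5, 0.7])
w = np.array([0.8, -0.3, 1.1])
gamma = 0.1
print(linear_regression(x, w, gamma))    # raw weighted sum
print(logistic_regression(x, w, gamma))  # value between 0 and 1
```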
Note that a logistic regression function is precisely a (single-layer) perceptron in the terminology of neural networks (Duda et al., 2001). The preprocessing perceptron similarly consists of a weighted sum, but includes a preprocessing function for each input variable. Suitable preprocessing functions are sigmoids (sigmoidal functions)

$$\sigma(x) = \frac{1}{1 + e^{-\beta (x - \alpha)}} \qquad (1)$$

where $\alpha$ is the parameter representing the position of the inflexion point, i.e., the soft cut-off or decision limit, and $\beta$ corresponds to the slope value at the inflexion point. Often the sigmoid is also used as an activation function in neural networks. In this case, $\alpha$ is usually set to 0 and $\beta$ to 1, i.e.,

$$\sigma(x) = \frac{1}{1 + e^{-x}}. \qquad (2)$$
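The following sketch puts the pieces together as a preprocessing perceptron, under the assumption that each input variable has its own preprocessing parameters (a decision limit and a slope as in Equation (1)) and that the weighted sum of the preprocessed values is passed through the standard sigmoid of Equation (2). All parameter values here are illustrative only.

```python
import numpy as np

def preprocess(x, alpha, beta):
    # Equation (1): per-input sigmoid with decision limit alpha
    # (inflexion point) and slope-related parameter beta.
    return 1.0 / (1.0 + np.exp(-beta * (x - alpha)))

def preprocessing_perceptron(x, alpha, beta, w, gamma):
    # Preprocess each input variable separately, combine the
    # preprocessed values in a weighted sum, and apply the
    # standard sigmoid of Equation (2) as output activation.
    z = preprocess(x, alpha, beta)      # elementwise over inputs
    s = np.dot(w, z) + gamma            # weighted sum
    return 1.0 / (1.0 + np.exp(-s))     # Equation (2)

# Illustrative (made-up) parameters for three input variables.
x     = np.array([4.2, 110.0, 0.8])    # raw measurement values
alpha = np.array([5.0, 100.0, 1.0])    # decision limits per variable
beta  = np.array([2.0, 0.1, 5.0])      # slopes at the decision limits
w     = np.array([0.6, 0.3, 0.9])
gamma = -0.5
print(preprocessing_perceptron(x, alpha, beta, w, gamma))
```

Because each decision limit stays in the units of its own input variable, the fitted preprocessing parameters can be read directly as soft cut-offs, which is what makes the network's parameters interpretable.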
