Breast Cancer Diagnosis Using Optimized Attribute Division in Modular Neural Networks

Rahul Kala, Anupam Shukla, Ritu Tiwari
Copyright: © 2011 | Pages: 14
DOI: 10.4018/jitr.2011010103

Abstract

The complexity of problems has led to a shift from traditional neural networks toward modular neural networks. The number of inputs to a neural network must be kept within manageable limits to escape the curse of dimensionality. Attribute division is a novel concept that reduces the problem dimensionality without losing information. In this paper, the authors use Genetic Algorithms to determine the optimal distribution of the attributes among the various modules of the modular neural network. The attribute set is divided among the modules, and each module computes its output using its own list of attributes. The individual results are then combined by an integrator. This framework is applied to the diagnosis of breast cancer. Experimental results show that the optimized distribution strategy outperforms well-known methods for the diagnosis of the disease.
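
To make the attribute-division idea concrete, the following is a minimal illustrative sketch, not the authors' implementation: the attribute indices are split between two modules, each module scores the inputs from its own subset, and a simple averaging integrator combines the module outputs. The particular split, the stand-in module, and the averaging rule are all assumptions made for illustration; in the paper, the division itself is what the Genetic Algorithm optimizes.

```python
import numpy as np

rng = np.random.default_rng(0)

n_attributes = 9                     # e.g., the 9 cytological features of the
                                     # Wisconsin breast cancer data
X = rng.random((100, n_attributes))  # placeholder inputs
y = (X.sum(axis=1) > n_attributes / 2).astype(float)  # placeholder labels

# One candidate division of the attribute indices into two modules.
division = [np.array([0, 2, 4, 6, 8]), np.array([1, 3, 5, 7])]

def module_predict(X_sub):
    """Stand-in for a trained neural module: returns scores in [0, 1]."""
    w = rng.random(X_sub.shape[1])
    return 1.0 / (1.0 + np.exp(-(X_sub @ w - w.sum() / 2)))

# Each module works only on its own attribute list; the integrator
# combines the module outputs (here, by simple averaging).
module_outputs = [module_predict(X[:, idx]) for idx in division]
integrated = np.mean(module_outputs, axis=0)
predictions = (integrated > 0.5).astype(float)
print("accuracy on placeholder data:", (predictions == y).mean())
```
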
Article Preview

Introduction

There has been a vast amount of research into the use of neural networks for problem solving. Neural networks are extensively used for a variety of problems, including biometrics, bioinformatics, robotics, and so forth (Shukla, Tiwari, & Kala, 2010a). Their ease of modelling and use makes neural networks good problem-solving agents. Neural networks carry out the task of machine learning: a training database is given to the system, and this database is a source of a large amount of information regarding patterns, trends, and knowledge about the problem domain. The task of the learning algorithm is to extract this knowledge and store it in the system's knowledge representation. In a neural network, this knowledge takes the form of the weights between the various neurons and the individual neuron biases. A commonly used architecture is the Multi-Layer Perceptron, in which the neurons are arranged in layers, the first being the input layer and the last the output layer. The input and output layers may be separated by a number of hidden layers. The Back Propagation Algorithm (BPA) is commonly used for training such networks; it applies gradient descent to fix the various weights and biases. The back propagation algorithm is, however, likely to get stuck in a local minimum, considering the very complex nature of the search space over which it operates (Konar, 1999).
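
As a concrete illustration, the following is a minimal sketch (not code from the paper) of a Multi-Layer Perceptron with a single hidden layer trained by back propagation, i.e., gradient descent on the weights and biases. The XOR problem serves as a toy training database; the layer sizes, learning rate, and epoch count are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy training database: XOR, a classic case that needs a hidden layer.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of 4 neurons; the weights and biases hold the
# extracted knowledge.
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
lr = 0.5

for epoch in range(5000):
    # Forward pass: input layer -> hidden layer -> output layer.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: gradient descent on the squared error.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)

print(out.round(3))  # should approach [0, 1, 1, 0]
```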

The weaknesses of the various soft computing paradigms have led to the emergence of the field of hybrid soft computing. Here two similar or different paradigms are mixed so as to magnify the advantages of each and diminish their disadvantages. This coupling of individual systems may allow each system to compensate for the limitations of the other, for an overall enhanced performance. Evolutionary neural networks are commonly used hybrid systems, in which neural modelling is fused with evolutionary computation to produce good problem-solving agents.

The architecture of a neural network is a major factor in deciding system performance. Traditional neural networks rely on human expertise to design the architecture, which is then trained by the training algorithm. This is, however, a labour-intensive task and may hence yield sub-optimal results. The training algorithm, in turn, may get stuck in a local minimum, with very poor exploration of the search space. Evolutionary algorithms are very strong optimizing agents that optimize the given problem in an iterative manner, fixing the values of all the parameters so as to optimize the final objective (Mitchell, 1999). Evolutionary neural networks hence use the optimization potential of evolutionary algorithms to evolve the complete architecture of the neural network, along with its weights and biases (Nolfi, Parisi, & Elman, 1990; Yao, 1999). Many times the evolutionary process is assisted by a local search strategy, such as the BPA or simulated annealing, which searches for a local minimum in the vicinity of the evolutionary individual's current location in the search space (Yao, 1993).
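
The sketch below illustrates the evolutionary half of such a hybrid under simplifying assumptions: a Genetic-Algorithm-style loop evolves the flattened weight vector of a small fixed-architecture network. The truncation selection, Gaussian mutation, and absence of crossover and local search are all simplifications for the example, not the authors' configuration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy problem: fit a 2-4-1 network to XOR by evolving its flattened
# weight vector instead of training it with gradient descent.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)

def unpack(genome):
    """Split a 17-value genome into the network's weights and biases."""
    W1 = genome[:8].reshape(2, 4); b1 = genome[8:12]
    W2 = genome[12:16].reshape(4, 1); b2 = genome[16:17]
    return W1, b1, W2, b2

def fitness(genome):
    """Negative mean squared error of the encoded network (higher is better)."""
    W1, b1, W2, b2 = unpack(genome)
    h = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))
    out = (1.0 / (1.0 + np.exp(-(h @ W2 + b2)))).ravel()
    return -np.mean((out - y) ** 2)

pop = rng.normal(scale=2.0, size=(50, 17))  # population of weight vectors
for gen in range(200):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-10:]]            # truncation selection
    children = parents[rng.integers(0, 10, size=40)]   # clone parents
    children += rng.normal(scale=0.3, size=children.shape)  # Gaussian mutation
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("best MSE:", -fitness(best))
```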

Classification is a fundamental problem of study. A classification system is given a set of features as inputs and is expected to return the class to which the input belongs. The classifier is supposed to build decision boundaries in the feature space that separate the various classes. Ideally the features should be such that instances of the classes show high inter-class separation and low intra-class separation; this makes it easy for the classifier to construct decision boundaries between the classes, separating them from one another. Every input attribute of the classifier is a dimension in the feature space. Additional dimensions usually make the construction of the decision boundary easier: two classes lying very close to each other may become separable by the addition of some dimension. This, however, may require more training instances and results in an immense increase in computation time, and the decision boundaries across many dimensions may become very complex to model and train (Shukla, Tiwari, & Kala, 2010b; Kala, Shukla, & Tiwari, 2009). Hence the number of inputs to the classifier needs to be limited, as the small example below illustrates.
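
A small numeric example of the point about added dimensions (illustrative only, not from the paper): two classes that overlap under any single threshold on one attribute become perfectly separable once a derived second dimension is added.

```python
import numpy as np

rng = np.random.default_rng(3)

# Class A sits near the origin; class B lies on both sides of it.
# No single threshold on x alone separates them (not linearly
# separable in one dimension).
x_a = rng.uniform(-0.4, 0.4, 50)
x_b = np.concatenate([rng.uniform(-1.0, -0.6, 25),
                      rng.uniform(0.6, 1.0, 25)])

# Adding the derived dimension x**2 lifts the points so that one
# linear boundary (a threshold on the new axis) separates the classes.
boundary = 0.25                       # x**2 = 0.25, i.e., |x| = 0.5
acc_a = np.mean(x_a ** 2 < boundary)
acc_b = np.mean(x_b ** 2 > boundary)
print("class A correct:", acc_a, "| class B correct:", acc_b)  # both 1.0
```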
