Contribution of Neural Networks in Different Applications

DOI: 10.4018/978-1-5225-9902-9.ch016

Abstract

Neural networks are proving to be highly beneficial in a wide variety of fields. Biomedical applications, such as breast cancer image classification and differentiating between malignant and benign types of breast cancer, are rapidly adopting neural networks, where they deliver remarkable results. Agriculture, a field crucial to human survival, can likewise benefit from neural networks, as can many other domains. This chapter explains neural networks in detail and provides an insight into their contribution to different applications, along with an analysis of that contribution.
Chapter Preview

Overview of the History of Neural Networks

Many people have contributed to the development of neural networks. Their contributions are as follows:

  • 1943: Warren McCulloch, a neurophysiologist, and Walter Pitts, a mathematician, wrote a paper on artificial neurons and modeled a simple neural network. This was the first time neural networks were introduced to the world.

  • 1949: Donald O. Hebb wrote a book titled “The Organization of Behavior,” which described a learning process for neural networks: the connection between two neurons strengthens when both are activated at the same time (a simple formulation of this rule is sketched after this list).

  • 1951: Marvin Minsky developed a neurocomputer that adjusted its weights automatically, although it did not see practical use.

  • 1956: A memory network was developed, and research on neural networks and neurons continued.

  • 1958: Frank Rosenblatt developed and successfully implemented the Mark I Perceptron, a neurocomputer that could recognize different numerals through an image sensor; it worked in cases where the input classes were linearly separable.

  • 1960: Bernard Widrow and Marcian E. Hoff developed the Adaptive Linear Neuron (ADALINE), the first commercially used neural network.

  • 1961: Karl Steinbuch implemented the concept of associative memory; his implementations are regarded as predecessors of today's neural associative memories. He also explained various other concepts for neural networks.

  • 1965: Nils Nilsson wrote the book Learning Machines, which explained neural networks and summarized their progress.

  • 1969: Marvin Minsky and Seymour Papert published Perceptrons, a mathematical analysis that exposed the limitations of the perceptron.

  • 1972: Teuvo Kohonen developed a model of associative memory.

  • 1973: Christoph von der Malsburg made use of a nonlinear neuron model.

  • 1974: Paul Werbos introduced the learning method known as backpropagation of error.

  • 1976: Gail Carpenter and Stephen Grossberg introduced adaptive resonance theory (ART).

  • 1982: John Hopfield developed the energy-based network now known as the Hopfield network.

  • 1985: Hinton, Sejnowski, and Ackley introduced the Boltzmann machine.

  • 1986: Rumelhart, Hinton, and Williams introduced the generalized delta rule.

  • 1988: Kosko introduced fuzzy-logic concepts into artificial neural networks and developed the Bidirectional Associative Memory (BAM).
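
As a brief illustration of the 1949 entry above, Hebb's idea that a connection strengthens when two neurons are active at the same time is commonly written today as a weight-update rule. The formulation below is a later mathematical reading of the rule, not Hebb's original notation, and the symbols are assumed for illustration:

$$\Delta w_{ij} = \eta \, x_i \, x_j$$

Here $w_{ij}$ is the strength of the connection between neurons $i$ and $j$, $x_i$ and $x_j$ are their simultaneous activations, and $\eta$ is a small learning rate; when both activations are positive at the same time, the connection weight grows.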

Key Terms in this Chapter

Artificial Neuron: It is an artificial replication of a neuron in the human brain.

Neural Network: It is a system with functionality akin to that of the human brain; it incorporates and replicates some of the brain's abilities, such as the ability to learn.

Neuron: It is a nerve cell that communicates with other cells.
