A Survey on Algorithms in Deep Learning


Sindhu P. Menon
DOI: 10.4018/978-1-7998-2803-7.ch017

Abstract

In the last couple of years, artificial neural networks have gained considerable momentum, and their results can be enhanced by making the networks deeper. Of late, enormous amounts of data have been generated, leading to big data, which brings many challenges; data quality is one of the most important among them. Deep learning models can help improve the quality of such data. In this chapter, an attempt has been made to review deep supervised and deep unsupervised learning algorithms and the various activation functions used. Challenges in deep learning have also been discussed.
Chapter Preview

Introduction

A typical neural network has a number of neurons interconnected in all possible ways, much like the neurons in our brain are connected. At any point of time, not all neurons need to be active. The hidden layers present between the input and output layers play a very important role; as their number grows, the network is termed a Deep Neural Network (DNN). The main advantage of a DNN over a shallow artificial neural network is that features are identified automatically: the more hidden layers there are, the more features get identified, which results in better prediction accuracy. The number of neurons in the input layer depends on the number of features. For instance, suppose we want to train the system to identify the image in Figure 1. The image can be assumed to be a matrix of 35*35 pixels, so the total number of pixels is 1225. The number of neurons in the input layer will therefore be 1225, as this is the number of input features.

Figure 1. Image for training (Courtesy: Google)
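To make the sizing above concrete, the following is a minimal sketch of such a feed-forward network whose input layer has 1225 neurons, one per pixel of a 35*35 image. PyTorch is assumed here only for illustration (the chapter does not prescribe a framework), and the hidden-layer widths and the 10-class output are illustrative assumptions rather than values from the chapter.

```python
# Minimal sketch (assuming PyTorch) of a DNN whose input layer has
# 1225 neurons -- one per pixel of a 35x35 image.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Flatten(),              # 35x35 image -> vector of 1225 features
    nn.Linear(35 * 35, 256),   # first hidden layer (width assumed)
    nn.ReLU(),
    nn.Linear(256, 64),        # second hidden layer (width assumed)
    nn.ReLU(),
    nn.Linear(64, 10),         # output layer (10 classes assumed)
)

# One 35x35 image: after flattening, 1225 pixel features feed the input layer.
x = torch.rand(1, 35, 35)
logits = model(x)
print(logits.shape)  # torch.Size([1, 10])
```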

Algorithms for training such networks can be classified as supervised and unsupervised. An overview of these algorithms is given in the next section.

The article contains the Introduction in Section 1, followed by the Literature Survey in Section 2. A survey of deep learning models is presented in Section 3, followed by the challenges in training them in Section 4. Finally, the article concludes in Section 5.


Literature Survey

A lot of surveys have been done in areas such as health, object detection, medical image analysis, cancer detection, agriculture, sentiment analysis and many more. Almost 300 papers were published in medical imaging alone in 2016. The need for deep nets arose from the fact that features had to be learnt efficiently, and the CNN was the first network introduced in this direction. A lot of work on CNNs has been done since the late 1970s (Fukushima, 1980), and they were applied to medical image analysis in 1995 (Lo et al., 1995). Using this concept, LeCun et al. (1998) achieved success with LeNet, their first real-world application, which recognised handwritten digits. LeNet and AlexNet (Krizhevsky et al., 2012) had two and five layers respectively, with a design in which larger layers were closer to the input and smaller layers closer to the output. The activation function used in AlexNet was ReLU. Despite these successes, deep nets did not gain enough momentum at first, as the earlier belief was that they were very difficult to train. Momentum was gained in 2006, when it was shown (Bengio et al., 2007; Hinton and Salakhutdinov, 2006; Hinton et al., 2006) that performance could be improved by training networks layer-wise and stacking them. More complex layers were then stacked to improve the efficiency of these models. An Inception model with 22 layers and convolutions of varying sizes was introduced by Szegedy et al. (2014); it was also named GoogLeNet.
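To illustrate the layer-wise training idea referred to above, the sketch below pretrains each layer greedily on the output of the previous one, after which the stack could be fine-tuned end-to-end on a supervised task. PyTorch, the use of simple autoencoders as the stacked building block, and all sizes and optimiser settings are illustrative assumptions, not a reconstruction of the cited papers.

```python
# Minimal sketch (assuming PyTorch) of greedy layer-wise pretraining
# by stacking autoencoder layers.
import torch
import torch.nn as nn

def pretrain_layer(data, in_dim, hidden_dim, epochs=5, lr=1e-3):
    """Train one autoencoder layer on `data` and return its encoder."""
    encoder = nn.Sequential(nn.Linear(in_dim, hidden_dim), nn.Sigmoid())
    decoder = nn.Linear(hidden_dim, in_dim)
    opt = torch.optim.Adam(
        list(encoder.parameters()) + list(decoder.parameters()), lr=lr
    )
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        recon = decoder(encoder(data))   # reconstruct the layer's own input
        loss = loss_fn(recon, data)
        loss.backward()
        opt.step()
    return encoder

# Unlabelled training data: 256 samples with 1225 features (e.g. 35x35 images).
data = torch.rand(256, 1225)

# Greedily pretrain and stack three encoder layers (widths assumed).
sizes = [1225, 512, 128, 32]
layers, current = [], data
for in_dim, hidden_dim in zip(sizes[:-1], sizes[1:]):
    enc = pretrain_layer(current, in_dim, hidden_dim)
    layers.append(enc)
    with torch.no_grad():
        current = enc(current)   # features used to pretrain the next layer

# The stacked encoders can then be fine-tuned end-to-end on labelled data.
stacked = nn.Sequential(*layers)
```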

A number of surveys exist in the field of agriculture and in deep learning more broadly (Deng & Yu, 2014; Wan et al., 2014; Najafabadi et al., 2015). To handle the issues related to agriculture, smart techniques for farming are needed (Tyagi, 2016). A survey of agricultural practices and how to apply deep learning to them was presented by Kamilaris, Gao, Prenafeta-Boldú, and Ali (2016).
