A number of approaches in the literature have taken architecture optimization into account, while others consider the use of different activation functions within the same network, but only for the Multilayer Perceptron with one hidden layer (Ivanov & Gavrilaş, 2014). This section describes only those works that are closest to ours.
Global search can prevent convergence to a non-optimal solution and determine the optimal number of ANN hidden layers. Recently, some studies on the architecture optimization problem have been introduced in order to determine neural network parameters, but not optimally (Lins & Ludermir, 2005).
Traditional algorithms fix the neural network architecture before learning (Joseph, 2008). Other studies propose constructive learning (Wang, Chaudhari, & Patra, 2004; Wang, 2008): it begins with a minimal structure, initializing the hidden layers with a minimal number of neurons and growing them as training proceeds. Most researchers, however, treat the construction of the neural architecture (structure) without finding the optimal one.
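The constructive strategy described above can be pictured with a minimal sketch (ours, not taken from the cited works): start with one hidden neuron, train, and add neurons only while the error remains unacceptable. The threshold, learning rate, and toy XOR task are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_mlp(X, y, n_hidden, epochs=2000, lr=0.5):
    """Train a one-hidden-layer sigmoid MLP by plain gradient descent;
    return the final mean squared error."""
    W1 = rng.normal(0, 1, (X.shape[1], n_hidden)); b1 = np.zeros(n_hidden)
    W2 = rng.normal(0, 1, (n_hidden, 1)); b2 = np.zeros(1)
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    for _ in range(epochs):
        h = sig(X @ W1 + b1)                      # hidden activations
        out = sig(h @ W2 + b2)                    # network output
        d_out = (out - y) * out * (1 - out)       # output-layer delta
        d_h = (d_out @ W2.T) * h * (1 - h)        # hidden-layer delta
        W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(0)
        W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(0)
    return float(np.mean((out - y) ** 2))

# Toy task: XOR, which a single hidden neuron cannot represent.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([[0], [1], [1], [0]], float)

# Constructive loop: grow the hidden layer while the error is too high.
n_hidden, error = 1, float("inf")
while error > 0.01 and n_hidden <= 8:
    error = train_mlp(X, y, n_hidden)
    if error > 0.01:
        n_hidden += 1
```

The loop stops at the first hidden-layer size that reaches the target error, illustrating how constructive methods search over structure rather than fixing it in advance.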
The authors of (Ludermir, Yamazaki, & Zanchettin, 2006) propose an approach that deals with few connections in one hidden layer and trains with different hybrid optimization algorithms; in other work, they use different activation functions in a group of networks (Gomes, Ludermir, & Lima, 2011).
In our previous work, we took the optimization of hidden layers and connections into account by introducing one decision variable for layers and another for connections (Ramchoun, Idrissi, Ghanou, & Ettaouil, 2017); in another work, we addressed the optimization of hidden nodes within layers, training both models with a back-propagation algorithm (Ettaouil, Lazaar, & Ghanou, 2013).
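The idea of decision variables for layers and connections can be sketched, in a simplified hypothetical form of our own, as binary bits gating whole hidden layers and individual weights; all names and sizes here are illustrative assumptions, not the exact model of the cited papers.

```python
import numpy as np

rng = np.random.default_rng(1)

n_in, width, n_out, max_hidden = 2, 4, 1, 3
# Decision variables (fixed here for illustration; an optimizer would search them):
use_layer = np.array([1, 0, 1])          # one bit per candidate hidden layer
W_in  = rng.normal(0, 1, (n_in, width))  # input projection
W_hid = [rng.normal(0, 1, (width, width)) for _ in range(max_hidden)]
W_out = rng.normal(0, 1, (width, n_out))
# One bit per connection: 0 prunes the corresponding weight.
mask_hid = [rng.integers(0, 2, W.shape) for W in W_hid]

def forward(x):
    """Forward pass that skips dropped layers and zeroes pruned connections."""
    h = np.tanh(x @ W_in)
    for bit, W, M in zip(use_layer, W_hid, mask_hid):
        if bit:                          # layer decision variable
            h = np.tanh(h @ (W * M))     # connection bits zero out pruned weights
    return h @ W_out
```

Because the bits enter the forward pass directly, an architecture-optimization procedure can score candidate bit assignments while the remaining real-valued weights are trained by back-propagation.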