Artificial Neural Networks and Their Applications in Business

DOI: 10.4018/978-1-5225-7368-5.ch074

Despite the natural advantage humans have in recognizing and interpreting patterns, large and complex datasets, as in big data, preclude efficient human analysis. Artificial neural networks (ANNs) provide a family of pattern recognition approaches for prediction, clustering, and classification applicable to knowledge discovery in datasets (KDD), with ANN model complexity ranging from simple (for small problems) to highly complex (for large problems). To provide a starting point for readers, this chapter first describes foundational concepts related to ANNs. Commonly used ANN methods, heuristics, and criteria for initializing ANNs are then discussed, followed by common pre- and post-processing methods for dimensionality reduction and data quality issues. The authors then provide a tutorial example of ANN analysis. Finally, the authors list and describe applications of ANNs to specific business-related endeavors for further reading.
Chapter Preview


Perspectives of Chapter

Knowledge discovery in datasets (KDD) is the process of finding non-obvious patterns or trends in datasets, primarily to assist in understanding complicated systems (Mannila, 1996); it leverages computational pattern recognition methods to predict, describe, classify, group, and categorize data (Jain, Duin, & Mao, 2000). The nonlinearity of ANNs lends itself to modeling complex data structures; however, it also makes ANNs complex and opaque to many users (Weckman, et al., 2009). While ANNs see use in business applications, some hesitation and misconceptions persist due to the ‘black-box’ nature of ANNs (Dewdney, 1997); hence, traditional statistics-based models remain far more widely used in practice. Although some interpretability can be achieved when applying ANNs (de Marchi, Gelpi, & Grynaviski, 2004), it should be noted that the interpretability of ANNs is subjective relative to other commonly used tools.

Objectives of Chapter

ANN methods are interconnected networks of nodes trained on patterns in data through statistical learning methods (Jain, Duin, & Mao, 2000). Although ANNs are computationally complex, they are inherently statistical in nature and epistemologically similar in function to Bayesian (Beck, King, & Zeng, 2004) and likelihood methods (Verikas & Bacauskiene, 2002). Various software packages are now available for practitioners, including NeuroDimensions (2005), Matlab (2010), JMP (Sall, Lehman, Stephens, & Creighton, 2012), and R (2008). The objective of this chapter is to provide readers with a general background of ANNs, their business applications, and guidance on developing quality ANN models. The target audience is readers who may not be familiar with this form of mathematical modeling but may want to pursue it for their business needs.

In Young, Bihl, and Weckman (2014), the authors provided a brief overview of ANNs and their applications in business. Herein, the authors have revised this discussion and included new material on ANN architectures and an example end-to-end analysis of business data using ANNs with the JMP13 Pro platform.

Biological Inspiration of ANNs

ANNs are neurologically inspired computational machine learning models intended to represent complex nonlinear input-output relationships. The earliest known development of the ANN model was in the 1940s (McCulloch & Pitts, 1943), with the fundamental building block of the ANN model being the neuron. Figure 1 displays a basic sketch of the biological neural network model (Neuralpower, 2007).
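
The 1943 McCulloch-Pitts model reduces a neuron to a binary threshold unit: the neuron fires (outputs 1) when the sum of its active binary inputs reaches a threshold. A minimal sketch in Python follows; the function name and threshold values are illustrative choices, not from the chapter:

```python
def mp_neuron(inputs, threshold):
    """McCulloch-Pitts unit: fire (output 1) iff enough binary inputs are active."""
    return 1 if sum(inputs) >= threshold else 0

# With binary inputs, the threshold alone realizes basic logic gates:
and_out = mp_neuron([1, 1], threshold=2)  # AND fires only when both inputs are 1
or_out = mp_neuron([1, 0], threshold=1)   # OR fires when any input is 1
print(and_out, or_out)  # 1 1
```

Modern ANNs generalize this unit with real-valued connection weights and smooth activation functions, which is what makes the gradient-based training discussed later in this chapter possible.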

Figure 1.

Biological Neuron (Young, Holland, & Weckman, 2008)


Computational ANNs extend biological neuron models by considering multiple interconnected nodes, termed “neurons,” which employ statistical methods to learn patterns between inputs and outputs (Jain, Duin, & Mao, 2000). Through organizational and iterative principles, connection weights between neurons, inputs, and outputs are computed to learn a nonlinear input-output relationship (Jain, Duin, & Mao, 2000).
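
The iterative computation of connection weights described above can be sketched with a small fully connected network trained by gradient descent. The sketch below assumes NumPy and uses the XOR function as a toy nonlinear input-output relationship; the layer sizes, learning rate, and iteration count are illustrative choices, not prescriptions from the chapter:

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR: a simple nonlinear input-output relationship
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Connection weights between the inputs, a hidden layer of 4 "neurons", and the output
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)

initial_error = np.mean((sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) - y) ** 2)

lr = 0.5
for _ in range(10000):
    # Forward pass: compute the network's current input-output mapping
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: gradients of the mean squared error w.r.t. each weight
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Iterative weight updates ("learning")
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

final_error = np.mean((out - y) ** 2)
print(initial_error, final_error)  # the error shrinks as the weights are learned
```

The "organizational principle" here is the layered architecture; the "iterative principle" is the repeated forward/backward pass that adjusts the weights to reduce prediction error.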
