Artificial Intelligence a Driver for Digital Transformation

Maria José Sousa, Gabriel Osório de Barros, Nuno Tavares
DOI: 10.4018/978-1-7998-4201-9.ch014

Abstract

Artificial intelligence is reconfiguring the economy and redefining the product and service market. It is a disruptive technology that leads to the creation of multiple, more efficient activities, new business models, and industrial processes. The literature stresses that AI is entering all aspects of the lives of organisations and individuals, and such complexities are still largely unstudied. The aim of this study is to highlight AI's innovations and applications for organisations' digital transformation.
Chapter Preview

1. Artificial Intelligence A Driver For Digital Transformation

In recent years, there has been increasing interest in Artificial Intelligence (AI) among academics and practitioners (Dwivedi et al., 2019), as well as in related concepts such as machine learning (ML). Artificial intelligence refers to machines' ability to think as human beings (Wang, 2019) - to have the power to learn, reason, perceive, and decide in a rational and intelligent way. The technologies associated with artificial intelligence include Machine Learning, Deep Learning, and Natural Language Processing, among others (Mayo and Leung, 2018; Montes and Goertzel, 2019).

Machine Learning involves machines that learn from the data introduced to them, with a minimum of programming, and reach results autonomously (e.g., the custom recommendations on Amazon). Machine learning makes it possible to construct a mathematical model from data, including many variables that are not known in advance. The parameters are configured along the learning process, using training data sets. Machine learning methods are classified into three categories: supervised learning, unsupervised learning, and reinforcement learning. These three categories group together different methods, including neural networks, deep learning, and others (Figure 1).

Figure 1. Machine Learning Dimensions (Source: Council of Europe, 2019)
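
To make these categories concrete, the sketch below contrasts supervised and unsupervised learning in Python. The use of scikit-learn and the toy data are assumptions made for illustration; the chapter does not prescribe any particular toolkit.

```python
# Minimal sketch (assumed library, invented data): supervised vs. unsupervised learning.
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

# Supervised learning: the training set carries known labels (y),
# and the model's parameters are configured from these examples.
X_train = [[0.1, 1.2], [0.4, 0.9], [2.1, 0.1], [2.5, 0.3]]
y_train = [0, 0, 1, 1]
classifier = LogisticRegression().fit(X_train, y_train)
print(classifier.predict([[2.0, 0.2]]))     # expected: [1]

# Unsupervised learning: no labels are given; the algorithm finds structure by itself.
X_unlabelled = X_train
clusterer = KMeans(n_clusters=2, n_init=10, random_state=0)
print(clusterer.fit_predict(X_unlabelled))  # cluster assignments, e.g. [0 0 1 1]
```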

Deep Learning: when machines use complex algorithms to mimic the neural network of the human brain and learn an area of knowledge with virtually no supervision (Schmidhuber, 2015).
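
As a small illustration of this idea, the sketch below trains a multi-layer network on the classic XOR problem, which a single-layer model cannot solve. The choice of scikit-learn's MLPClassifier and the toy data are assumptions for illustration only.

```python
# Minimal sketch (assumed library, toy data): a small multi-layer ("deep") network.
from sklearn.neural_network import MLPClassifier

X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [0, 1, 1, 0]  # XOR truth table: not linearly separable

# Two hidden layers of artificial neurons, loosely mimicking stacked layers of neurons.
net = MLPClassifier(hidden_layer_sizes=(8, 8), activation="tanh",
                    solver="lbfgs", max_iter=5000, random_state=0)
net.fit(X, y)
print(net.predict(X))  # expected [0 1 1 0], though results can vary with initialisation
```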

Natural Language Processing (NLP): machine learning techniques used to find patterns in large data sets and to recognize natural language (Huang and Rust, 2018). An example is the application of NLP to sentiment analysis, where algorithms look for patterns in social network posts to understand how customers feel about specific brands and products.
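
The following sketch shows one way such sentiment analysis can work: a bag-of-words model learns which word patterns are associated with positive or negative posts. The library, the tiny corpus, and the labels are invented for illustration.

```python
# Minimal sketch (assumed library, invented corpus): bag-of-words sentiment analysis.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

posts = [
    "love this brand, great product",
    "excellent service, very happy",
    "terrible quality, waste of money",
    "awful experience, never again",
]
labels = ["positive", "positive", "negative", "negative"]

# The vectorizer turns each post into word-frequency features; the classifier
# then learns which patterns of words signal each sentiment.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(posts, labels)

print(model.predict(["really happy with this great product"]))  # likely ['positive']
```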

Automation, cloud computing, and IoT are now joined by artificial intelligence (AI) to enable the smarter machine, the smarter factory, and the smarter ecosystem (Wan et al., 2018; Sousa et al., 2019).

AI is driven by the combination of almost limitless computing power in the cloud, the digitization of our world (Huang and Rust, 2018), and breakthroughs in how computers can use this information to learn and reason much like people do.

By applying advanced AI technologies, such as machine learning and cognitive services (Huang and Rust, 2018), to the data coming in from the manufacturing process, organisations gain a value-added layer of insight into their data. This allows them to improve operational efficiency, speed up production, optimize equipment performance, minimize waste, and reduce maintenance costs.
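
One common way to build such an insight layer is anomaly detection over equipment sensor readings, so that unusual behaviour can be inspected before it becomes costly downtime. The sketch below uses scikit-learn's IsolationForest on synthetic temperature and vibration data; both the library choice and the data are assumptions for illustration.

```python
# Minimal sketch (assumed library, synthetic data): flagging unusual sensor readings.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Normal operation: temperature around 70, vibration around 0.3 (invented units).
normal = np.column_stack([rng.normal(70, 2, 500), rng.normal(0.3, 0.05, 500)])
# A few abnormal readings: overheating and heavy vibration.
faults = np.array([[95.0, 1.2], [88.0, 0.9]])
readings = np.vstack([normal, faults])

detector = IsolationForest(contamination=0.01, random_state=0)
flags = detector.fit_predict(readings)   # -1 = anomaly, 1 = normal
print(np.where(flags == -1)[0])          # indices of readings flagged for inspection
```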

Advancements in AI are also opening the way to a hybrid workforce in which people and machines work together (Jarrahi, 2018). According to IDC, by 2020, 60 percent of plant-floor workers at G2000 manufacturers will work alongside assistance technologies that enable automation, such as robotics, 3D printing, AI, and AR/VR.

Artificial intelligence is a technology that has been improving the performance of the manufacturing and services sectors (Li et al., 2017). This study presents a holistic analysis of AI, covering both theoretical frameworks and practical experiences, and provides a broad review of recent developments within the field of AI and its applications (Murdoch et al., 2019).


To answer the first research question (RQ1: What are the main AI research trends?), an analysis of scientific publishing was carried out based on the Scopus database (Figure 2).

Figure 2. Trends in scientific publishing related to AI, 2006-2016

Key Terms in this Chapter

Clustering: An algorithmic technique that allows machines to group similar data into larger data categories.

AI Software Platforms: Are used to build intelligent applications that provide predictions, answers, or recommendations. These applications automatically learn, adapt, and improve over time using information access processes combined with deep/machine learning.

Bayesian Networks: A graph-based model representing a set of variables and their dependencies, focused on decision-making processes (a minimal sketch follows this list of key terms).

Machine Translation: An application of NLP used for language translation (human-to-human) in text- and speech-based conversations.

Deep Learning: Is a sub-discipline of machine learning that utilizes artificial neural networks configured across multiple layers.

Computer Vision: How machines visually perceive, interpret, and understand the world from images or videos.

Machine Learning: Is the scientific study of algorithms and statistical models that perform various functions without having to be explicitly programmed by a human, relying on patterns and inference rather than explicit instructions. ML algorithms build a mathematical model based on sample data, known as “training data”, in order to make predictions or decisions. Its application to business is referred to as predictive analytics.

Keyword and Phrase Recognition: A system's capacity to recognize specific words and phrases.

Knowledge Bases: Repositories of information organized as entities or intents linked together so that the application can find answers to questions or ambiguous references.

Big Data: Large amounts of structured and unstructured data that are too complex to be handled by standard data-processing software.

Unsupervised Learning: A type of machine learning where an algorithm is trained with information that is neither classified nor labelled, thus allowing the algorithm to act without guidance (or supervision).

Natural Language Processing (NLP): Focuses on the interactions between computers and human language in both the spoken and written form.

Unstructured Data: Raw data (e.g., audio, video, social media content).

Natural Language Generation: Is the capacity to construct textual and conversational narratives from structured or semi-structured data.

Robotic Process Automation (RPA): Uses software with AI and ML capabilities to perform repetitive tasks.

Data Science: Combines mathematics, statistics, probability, computing, and data visualization to extract knowledge from heterogeneous sets of data (images, sound, text, genomic data, social network links, physical measurements, and other sources).

Voice and Speech Recognition: The translation of spoken or typed phrases into text to prepare them for analysis.

Data Mining: The process of analyzing large volumes of data to bring out models, correlations, and trends, in order to identify recurring patterns while establishing problem-solving relationships.

Artificial Intelligence: Is the combination of software and hardware that attempts to simulate a human being.

Robotics: Focused on the design and manufacturing of robots that exhibit and/or replicate human intelligence and actions.

Artificial Neural Network (ANN): Is an algorithm that attempts to replicate the operation of the human brain through the utilization of connected neurons which are organized in layers and send information to each other.

Chatbots: A chat robot that can converse with a human user through text or voice commands.

Algorithm: Is a set of rules to solve a problem or carry out a routine or calculation.

Optical Character Recognition (OCR): Conversion of images of text (typed, handwritten, or printed) either electronically or mechanically, into machine-encoded text.

Conversational AI Software Platforms: Are a subset of AI platforms specialized for the development of intelligent digital assistants and conversational chatbots. They use content analytics, information discovery, and other technologies to communicate with human beings.
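
As noted in the Bayesian Networks entry above, the following is a minimal sketch in plain Python of a two-variable network and a belief update by Bayes' rule. The network, its probabilities, and the function name are invented for illustration.

```python
# Minimal sketch (invented network and probabilities): Rain -> WetGrass.
# Each table encodes a variable's dependence on its parent in the graph.
P_rain = 0.2                              # P(Rain = True)
P_wet_given = {True: 0.9, False: 0.1}     # P(WetGrass = True | Rain)

def posterior_rain_given_wet():
    """P(Rain = True | WetGrass = True), computed by Bayes' rule over the network."""
    joint_rain = P_rain * P_wet_given[True]            # P(Rain, Wet)
    joint_no_rain = (1 - P_rain) * P_wet_given[False]  # P(no Rain, Wet)
    return joint_rain / (joint_rain + joint_no_rain)

print(round(posterior_rain_given_wet(), 3))  # 0.18 / (0.18 + 0.08) ≈ 0.692
```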
