Augmented Data Prediction Efficiency for Wireless Sensor Network Application by AI-ML Technology

Jeba Kumar R. J. S., Roopa JayaSingh J., Alvino Rock C.
DOI: 10.4018/978-1-7998-5068-7.ch017

Abstract

Practical wireless sensor networks (WSN) demand cutting-edge artificial intelligence (AI) technology such as deep learning (DL), a subset of the AI paradigm, to impart intelligence to end devices or nodes. Innovation of AI in WSN aids the increasingly connected world of the internet of things (IoT). AI is an evolving area of intelligent learning by computers via machine learning algorithms (MLA). This chapter deals with the implementation of AI technologies in the areas of advanced machine learning, language recognition using natural language processing (NLP), and image recognition through a live machine learning example. MLA are constructed to predict an optimized output from training dataset inputs. In image recognition, an outcome model utilizes an existing reference model to perform DL-based AI prediction. Complex DL AI services are achieved through the Bluemix-powered Watson Studio and Watson Assistant Service. Application programming interface (API) keys are designated to connect Watson to the Node-RED Starter (NRS), which provides the web interface.

Introduction

Artificial intelligence enables us to give decision-making ability to a machine. From a human point of view, intelligence is thinking and taking decisions based on the situation; decisions are drawn from previous experience, learning, and trial and error. For example, a sensor tells a robot to move left or right based on an obstacle. When input and intelligence are joined, the machine can think intelligently, predict, and perform pattern recognition (Ghahramani, 2015; Fu, Geng-Shen, et al., 2019; Aluri, et al., 2019; Cavalcante et al., 2019). The basic functionalities remain the same, but the way they are applied is evolving drastically. Figure 1 depicts the family tree of the artificial intelligence divisions.
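As a minimal sketch of this sensor-driven decision making (the function name, distance readings, and threshold below are hypothetical illustrations, not taken from the chapter), a simple rule can map obstacle distances to a steering decision:

# Minimal rule-based decision sketch: choose a direction from obstacle distances.
# All names and thresholds here are illustrative assumptions.
def decide_direction(left_cm: float, right_cm: float, safe_cm: float = 30.0) -> str:
    """Return 'forward' if both sides are clear, else turn toward the clearer side."""
    if left_cm >= safe_cm and right_cm >= safe_cm:
        return "forward"
    return "left" if left_cm > right_cm else "right"

# Obstacle close on the left, so the robot turns right.
print(decide_direction(left_cm=12.0, right_cm=55.0))  # -> right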

Figure 1. Artificial Intelligence Family Tree

Machine learning is the brainchild of the analytics paradigm. A machine learning algorithm (MLA) uses complex computational methodology to learn from data without depending on a predetermined or pre-programmed equation as the base model. MLA are subdivided into supervised, unsupervised, and reinforcement learning. Supervised learning (SL) is task driven, with a well-defined goal and objective: predicting the subsequent value. Unsupervised learning (USL) is data driven, identifying clusters or segments for prediction purposes. Reinforcement learning (RL) suits real-time and stable systems where the system learns on its own through mistakes, i.e. the path and the relationships are learnt by the system itself.
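As an illustrative sketch only (the sensor readings, targets, and choice of a linear model are assumptions, not the chapter's dataset), supervised learning fits a model to labelled training examples and then predicts the subsequent value for an unseen input:

# Supervised-learning sketch with scikit-learn (hypothetical WSN data).
import numpy as np
from sklearn.linear_model import LinearRegression

# Labelled training set: node temperature (input) versus measured battery drain (target).
X_train = np.array([[20.0], [25.0], [30.0], [35.0], [40.0]])
y_train = np.array([1.0, 1.3, 1.7, 2.2, 2.8])

model = LinearRegression().fit(X_train, y_train)

# Predict the target for a reading the model has not seen before.
print(model.predict(np.array([[28.0]])))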

Goals of artificial intelligence include problem solving, machine learning, language, motion, and creativity. For deep learning, AI is the source of data viewed from many angles. Processing huge volumes of data requires processors with high processing speed and computing power. In 1956 a hard disk was physically large yet stored only 5 MB; in 1980 a 10-megabyte hard disk was in use; and in 2019 a 400 GB hard disk costs about $84. Across the scenarios from 1956 to 2019, the physical size and the cost of hard disks kept decreasing (Dubitzky et al., 2019). The cloud provides AI and ML services such as Watson Assistant. A chatbot is the part of an AI system required for chatting with an intelligent robot, as depicted in Figure 2. Using Watson Assistant, we can build our own chatbot. It includes three layers: the presentation layer, the machine learning layer, and the data layer. The presentation layer is where the user interacts with either the app or the website.
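As a hedged sketch of how such a Watson Assistant chatbot can be reached from code (the API key, service URL, assistant ID, and version string below are placeholders or assumptions, not credentials from the chapter), the ibm-watson Python SDK is typically used along these lines:

# Sketch of calling Watson Assistant via the ibm-watson Python SDK.
# API key, service URL, and assistant ID are placeholders to be replaced with real credentials.
from ibm_watson import AssistantV2
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

authenticator = IAMAuthenticator("YOUR_API_KEY")
assistant = AssistantV2(version="2021-06-14", authenticator=authenticator)
assistant.set_service_url("https://api.us-south.assistant.watson.cloud.ibm.com")

# Open a session, send one user utterance, and print the assistant's reply.
session = assistant.create_session(assistant_id="YOUR_ASSISTANT_ID").get_result()
reply = assistant.message(
    assistant_id="YOUR_ASSISTANT_ID",
    session_id=session["session_id"],
    input={"message_type": "text", "text": "Hello"},
).get_result()
print(reply["output"]["generic"][0]["text"])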

Figure 2. AI-ML Chat-bot Layer

The three main layers of an AI-ML chatbot are the presentation layer (PL), the machine learning layer (MLL), and the data layer (DL). The prime responsibility of the PL is delivering information and formatting it for the next layer for additional processing and display. Encryption and decryption are also prime duties of the presentation layer; for example, the chatbot of a bank site or of Facebook usually encrypts and decrypts sensitive details while pushing them into the subsequent layers for further processing. The data layer handles the movement of data into and out of the physical or hardware layer; its three prime functions are regulating the flow of data, combating transmission errors, and delivering a well-defined interface boundary to the subsequent network layer, with logical organisation of data, encapsulation, and frame synchronisation also carried out here. The prime responsibility of the MLL comprises entities such as natural language processing (NLP) and natural language understanding (NLU). NLP occurs when the computer reads language, i.e. turns text into structured data. NLU is more narrowly focused on making the computer comprehend exactly what a body of text means, in order to predict tone, classification, sentiment, and clarity of context, as in Alexa and Google Assistant. Natural language generation (NLG) is used when the computer performs the writing side of language processing, i.e. turning structured data into text; NLG performs the exact inverse operation of NLU. The AI-ML decision-making engine works on the collective functionalities of NLU and NLG.
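As a minimal sketch of this NLU-to-NLG flow (the intents, keywords, and response templates are hypothetical illustrations, not the chapter's model), the NLU step maps free text to a structured intent and the NLG step turns that structure back into text:

# Sketch of the NLU -> decision -> NLG pipeline described above.
# Intents, keywords, and templates are illustrative assumptions.
NLU_KEYWORDS = {
    "check_balance": ["balance", "account", "funds"],
    "report_issue":  ["error", "broken", "not working"],
}

NLG_TEMPLATES = {
    "check_balance": "Your current balance is {balance}.",
    "report_issue":  "Sorry about that. A ticket has been raised for: {details}.",
    "fallback":      "I did not understand. Could you rephrase?",
}

def understand(text: str) -> str:
    """NLU step: map free text to a structured intent label."""
    lowered = text.lower()
    for intent, keywords in NLU_KEYWORDS.items():
        if any(word in lowered for word in keywords):
            return intent
    return "fallback"

def generate(intent: str, **slots) -> str:
    """NLG step: turn the structured intent (plus slot values) back into text."""
    return NLG_TEMPLATES[intent].format(**slots)

# Decision engine: the NLU output selects which NLG template is rendered.
print(generate(understand("What is my account balance?"), balance="$120.50"))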
