1. Introduction
In any regulated environment there is a mandatory requirement to ensure the accuracy, transparency and continuity of data in order to prove authentic system performance. In the context of reliability measurement, data and analytics are the two most important components. Wang and Strong (1996) defined data quality (DQ) as a set of attributes governed by rules such as completeness, conformity, validity and accuracy. Subsequently, Gartner (2012) proposed an analytics ascendancy model showing how the value of analytics increases as the capabilities of a system are extended. Conventionally, preparatory, descriptive and diagnostic analytics are the common methods used to perform reliability measurement, as illustrated by A. Bilal et al. (2018), N. Pezzotti et al. (2018) and S. Liu et al. (2017). In addition to cause-related analytics, predictive analytics and reliability-based analytics are also used. These can be described as combinatorial processes:
- Preparatory Analytics: Also called cross-examination of data, this type of analytics is used to evaluate existing DQ (data quality) levels for variables and CDEs (critical data elements). Techniques such as DQ business rules, DQ rule evaluation, and statistical process control are useful for assessing DQ levels.
- Descriptive Analytics: Investigates a particular process, operation, facility or CDE. This type of analytics is performed using tools such as data mining, basic profiling, and descriptive statistics. It helps us understand performance at a given point in time by providing a snapshot with means and standard deviations.
- Diagnostic Analytics: Performed to determine when, where, why, and how a particular problem occurred. Techniques such as correlation analysis, hypothesis testing, analysis of variance (ANOVA), and control charts are typically used in this type of analytics.
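The three analytics types above can be illustrated with a minimal sketch. The data below is made up for illustration: a simple completeness rule (preparatory), a mean/standard-deviation snapshot (descriptive), and a crude 3-sigma control-chart rule (diagnostic).

```python
import statistics

# Hypothetical sample: response times (ms) recorded for one CDE
response_times = [120, 135, 128, 142, 119, 131, None, 127, 138, 125]

# Preparatory analytics: a simple DQ completeness rule (share of non-null values)
non_null = [v for v in response_times if v is not None]
completeness = len(non_null) / len(response_times)

# Descriptive analytics: performance snapshot at a point in time
mean = statistics.mean(non_null)
stdev = statistics.stdev(non_null)
print(f"completeness={completeness:.2f}, mean={mean:.1f}, stdev={stdev:.1f}")

# Diagnostic analytics: a crude control-chart rule flagging points beyond 3 sigma
outliers = [v for v in non_null if abs(v - mean) > 3 * stdev]
print("outliers:", outliers)
```

In practice each step would use a dedicated DQ or statistics toolkit; the point here is only how the three analytics types build on one another.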
Considering the relevance of application faults, the code of a web application may need to be customized. Martin Fowler (2009) introduced code refactoring techniques that can format code with a precise methodology; combined with structured programming, this helps to identify error-prone pieces of code. However, a recent study by Josua Krause et al. (2016) suggests that visual analytics could provide immediate support for machine learning. There are many research opportunities concerning visual analytics and machine learning with respect to analyzing software reliability. We identify two generic modalities that can be used for model interpretation with visual analytics. Visualizing Model Structure (White-Box): for transparent models, e.g., decision trees, one option is to use visualization to represent the structure built through the training method; several examples exist in this area, especially for decision trees and rules. Visualizing Model Behavior (Black-Box): use visualization to examine the behavior of the model by looking exclusively at the relationship between input and output.
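The black-box modality can be sketched in a few lines: the model is treated as opaque, and its behavior is probed purely through input/output pairs. The function `opaque_model` and the feature values below are invented stand-ins for a real trained predictor.

```python
# Black-box behavior probing: treat the model as opaque and study the
# input -> output relationship by sweeping a single feature.
def opaque_model(x):
    # Stand-in for a trained model; a real predictor would be learned from data.
    return 0.8 if x[0] > 50 else 0.2

base = [40, 1.0, 7]          # a reference input row
sweep = []
for v in range(0, 101, 25):
    probe = list(base)
    probe[0] = v             # vary only feature 0, hold the rest fixed
    sweep.append((v, opaque_model(probe)))

print(sweep)  # the jump in output locates a decision boundary near feature 0 = 50
```

A visual analytics tool would plot such sweeps rather than print them, but the underlying probing strategy is the same.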
Very recently, a novel visual analytics system design named Prospector was described by A. Bilal et al. (2018). This tool helps analysts better understand predictive models (Krause et al., 2016). Prospector aims to support data scientists in going beyond judging predictive models solely on their accuracy scores, by also providing model interpretability and actionable insights.
The partial dependence of a feature f at value v is computed as

$$pdp_f(v) = \frac{1}{N}\sum_{i=1}^{N} pred\big(x_i[f \leftarrow v]\big) \qquad (1)$$

where N is the number of rows in the input matrix X, pred is the prediction function that takes one input row (a feature vector) and returns a prediction score, f is the feature used to compute the partial dependence plot, and $x_i[f \leftarrow v]$ denotes row $x_i$ with the value of feature f replaced by v. The formula computes the average outcome over all input rows while changing the value of feature f to the input value v for each row $x_i$. Hence, reliability measures of web applications are gaining importance as an implicit measure of reliability. There are different strategies for web applications where, in a real-time environment, key product workflows are applicable to ensure trusted results under satisfiable load tests.
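The partial dependence computation described above is straightforward to implement directly from its definition. The sketch below uses a made-up linear predictor and toy rows purely for illustration:

```python
# Partial dependence of feature f at value v, computed from the definition:
# average pred over all rows after forcing feature f to value v in each row.
def pdp(rows, pred, f, v):
    total = 0.0
    for row in rows:
        modified = list(row)
        modified[f] = v           # replace feature f with the probe value v
        total += pred(modified)   # prediction score for the modified row
    return total / len(rows)

# Hypothetical predictor: score depends on both features
def pred(row):
    return 0.5 * row[0] + 0.1 * row[1]

rows = [[1, 10], [2, 20], [3, 30]]
print(pdp(rows, pred, f=0, v=2.0))  # 0.5*2 + 0.1*mean([10, 20, 30]) = 3.0
```

Plotting pdp over a range of v values for one feature yields the partial dependence plot used by tools such as Prospector.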