An Experimental Approach and Monitoring Tools for Evaluating a Dynamic Cubing System

Anne Tchounikine, Maryvonne Miquel, Usman Ahmed
Copyright © 2016 | Pages: 19
DOI: 10.4018/IJDWM.2016100101

Abstract

In this paper, the authors propose an approach and a set of tools to evaluate the performance and assess the effectiveness of a model in the field of dynamic cubing. Experimental evaluation, on the one hand, allows observing the behavior and performance of a solution; on the other hand, it allows comparing its results with those of competing solutions. The authors' proposal includes an experimental workflow based on a set of configuration parameters that characterize the inputs (data sets, query sets, and algorithm input parameters) and a set of metrics to analyze and qualify the outputs (performance and behavior metrics) of the solution. They identify a number of tools necessary to develop an experimental evaluation strategy. These monitoring tools support the design of execution scenarios and the collection, storage, and analysis of output metrics, both online in real time and later in offline mode. Using a use-case model, the authors show that the framework and the proposed environment help carry out a rigorous experimental evaluation of a dynamic cubing solution.

Introduction

This article addresses the issue of experimental evaluation that we faced during a research project in the field of real-time data warehousing. It is an extended version of (Tchounikine, 2015). We highlight how a rigorous approach can promote experimental evaluation and complement the academic evaluation performed through formal demonstration and qualification of models and algorithms. The research project was conducted in the field of new agile BI applications operating in fast-evolving environments. In earlier work, we proposed a solution to provide better data freshness and reduce analysis latency (Ahmed, Tchounikine, Miquel & Servigne, 2010). This solution allows on-the-fly insertions of facts and members by means of frequent atomic insertions, leading to fast aggregate updates. We define a Dynamic Cube based on an unordered multidimensional and multi-level data space that is allowed to evolve. A tree structure incrementally stores detailed data and aggregates; for the densest regions of the data space, a split strategy promotes refinement of aggregates at increasingly lower levels. This proposal was implemented in a prototype consisting of a suite of tools covering everything from fact loading to OLAP navigation in the dynamic cube. The prototype was first aimed at functional testing and helped us demonstrate the feasibility of the solution. In a second step, we used it to carry out the experimental evaluation of the solution.
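To make the idea concrete, here is a minimal, self-contained sketch in Python of such an incremental aggregation tree. The names, the SUM aggregate, the capacity threshold and the "halve the widest dimension" split rule are all our own simplifying assumptions, not the authors' actual prototype:

```python
# Sketch of an incremental aggregation tree (hypothetical names and split
# rule; not the authors' prototype). Each node keeps a running SUM for a
# hyper-rectangular region of the data space; when a leaf becomes too
# dense, its widest dimension is halved, refining aggregates one level lower.

CAPACITY = 64  # assumed density threshold that triggers a split

class CubeNode:
    def __init__(self, region):
        self.region = region      # list of (lo, hi) bounds, one per dimension
        self.count = 0            # number of facts under this node
        self.total = 0.0          # incrementally maintained aggregate (SUM)
        self.facts = []           # detailed data, kept until the node splits
        self.children = None      # two subregions once the node has split

    def insert(self, coords, measure):
        # Atomic insertion: aggregates on the root-to-leaf path are updated
        # immediately, so queries always see fresh totals.
        self.count += 1
        self.total += measure
        if self.children:
            self._child_for(coords).insert(coords, measure)
        else:
            self.facts.append((coords, measure))
            if self.count > CAPACITY:
                self._split()

    def _split(self):
        # Refine the dense region: halve its widest dimension and push the
        # detailed facts one level down (degenerate cases not handled here).
        d = max(range(len(self.region)),
                key=lambda i: self.region[i][1] - self.region[i][0])
        lo, hi = self.region[d]
        self.split_dim, self.split_mid = d, (lo + hi) / 2.0
        left, right = list(self.region), list(self.region)
        left[d] = (lo, self.split_mid)
        right[d] = (self.split_mid, hi)
        self.children = (CubeNode(left), CubeNode(right))
        facts, self.facts = self.facts, []
        for coords, measure in facts:
            self._child_for(coords).insert(coords, measure)

    def _child_for(self, coords):
        below = coords[self.split_dim] < self.split_mid
        return self.children[0] if below else self.children[1]
```

For instance, cube = CubeNode([(0.0, 100.0), (0.0, 100.0)]) followed by repeated cube.insert((x, y), m) calls keeps cube.total current after every single insertion, which is the property that enables fast aggregate updates.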

Experimental evaluations can be of two types. The first type consists of observing the behavior and performance of a solution under varying but tightly controlled circumstances: this helps in understanding how it works, allows testing formal postulates experimentally, and provides feedback to improve the solution or tune its running parameters. The second type consists of carrying out comparative studies, in which the results obtained are compared with those of competing solutions. Here again, the execution circumstances must be tightly controlled to ensure a fair comparison.

We carried out both types of evaluation using the prototype in the context of our work on Dynamic Cubing. For this purpose, we followed an experimental approach based on a workflow that specifies the input parameters determining the execution context of the experiments, the output metrics relevant to the evaluation of results, and a set of tools allowing the design of scenarios, the tuning of inputs, and the collection and interpretation of outputs. We believe this experience shows how experimental evaluations benefit from being designed and adopted from the very beginning of a project and maintained throughout.
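As an illustration only, such a workflow can be captured by a small experiment descriptor plus a runner that replays a scenario and logs output metrics for on-line or off-line analysis. The field names, the CSV format and the latency metric below are our own assumptions; the paper's actual parameters and metrics are defined in later sections:

```python
# Illustrative experiment descriptor and runner (hypothetical field names;
# cube is any object exposing insert() and count, e.g. the CubeNode above).
import csv
import time
from dataclasses import dataclass

@dataclass
class ExperimentConfig:
    dataset: str        # characterizes the input data set (e.g. a file path)
    query_set: str      # characterizes the workload replayed on the cube
    algo_params: dict   # algorithm input parameters, e.g. {"capacity": 64}

def run_scenario(config, cube, facts, out_path="metrics.csv"):
    # Replays one insertion scenario and logs a behavior metric
    # (per-insertion latency). The file can be watched in real time by
    # tailing it, or analyzed later in off-line mode.
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        # Tag the run with its execution context for fair comparisons.
        writer.writerow(["dataset", config.dataset,
                         "query_set", config.query_set])
        writer.writerow(["insertion", "latency_s", "facts_in_cube"])
        for i, (coords, measure) in enumerate(facts):
            t0 = time.perf_counter()
            cube.insert(coords, measure)
            writer.writerow([i, time.perf_counter() - t0, cube.count])
```

Recording the configuration alongside the metrics is what makes a run reproducible and comparable: two runs can only be compared fairly when their descriptors match on everything except the parameter under study.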

This paper is organized as follows: the second section summarizes our earlier contribution in the field of Dynamic Cubing and motivates our work on experimental evaluation. The next section lists and defines the input parameters used to customize the settings of an experimental evaluation. The section “Evaluation Metrics” defines the metrics used to evaluate and observe the solution. The following section presents the tools used to perform the experiments, gives samples of experiment traces, and shows how the input parameters are set to reproduce the use cases of a test scenario and how the metrics are used to analyze them. We end with related work and conclude.
