Evaluating the Effectiveness of Static Analysis Programs Versus Manual Inspection in the Detection of Natural Spreadsheet Errors

Salvatore Aurigemma, Ray Panko
Copyright © 2014 | Pages: 19
DOI: 10.4018/joeuc.2014010103

Abstract

Spreadsheets are widely used in the business, public, and private sectors. However, research and practice have generally shown that spreadsheets frequently contain errors. Several researchers and vendors have proposed the use of spreadsheet static analysis programs (SAPs) as a means to augment or potentially replace the manual inspection of spreadsheets for errors. SAPs automatically search spreadsheets for indications of certain types of errors and present these indications to the inspector. Despite the potential importance of SAPs, their effectiveness has not been examined. This study explores the effectiveness of two widely fielded SAPs in comparison to manual human inspection on a set of naturally generated quantitative errors in a simple yet realistic spreadsheet model. The results showed that while the manual inspection results in this study were consistent with previous research in the field, the static analysis programs performed very poorly at detecting natural errors in every category of spreadsheet errors.
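The abstract's description of how SAPs work, searching a spreadsheet for indications of certain error types and presenting them to an inspector, can be illustrated with a minimal sketch. The heuristic below, common in tools of this kind, flags a formula whose pattern breaks with the other formulas in its row. It is not either of the SAPs evaluated in the study; the use of Python with the openpyxl library, the simplified reference-normalizing regex, and the file name model.xlsx are all assumptions for illustration only.

```python
# Illustrative sketch of one static-analysis heuristic: flag a cell whose
# formula pattern differs from the majority pattern of its row.
# NOT the tools studied in the paper; openpyxl and the regex are assumptions.
import re
from collections import Counter

import openpyxl

# A1-style cell references, optionally absolute ($A$1); simplified on purpose.
CELL_REF = re.compile(r"\$?[A-Z]{1,3}\$?\d+")


def normalize(formula: str) -> str:
    """Replace every cell reference with a placeholder so formulas that
    differ only in which cells they point at compare as equal."""
    return CELL_REF.sub("REF", formula)


def inconsistent_formula_cells(path: str):
    """Yield (sheet, coordinate) for cells whose normalized formula
    disagrees with the majority pattern of formulas in the same row."""
    wb = openpyxl.load_workbook(path)  # default mode keeps formula strings
    for ws in wb.worksheets:
        for row in ws.iter_rows():
            formulas = [(c.coordinate, normalize(c.value))
                        for c in row
                        if c.data_type == "f"]      # formula cells only
            if len(formulas) < 3:                    # too few to judge
                continue
            majority, count = Counter(p for _, p in formulas).most_common(1)[0]
            if count < len(formulas) - 1:            # no clear majority pattern
                continue
            for coord, pattern in formulas:
                if pattern != majority:
                    yield ws.title, coord


if __name__ == "__main__":
    for sheet, coord in inconsistent_formula_cells("model.xlsx"):
        print(f"{sheet}!{coord}: formula breaks the row's pattern")
```

A fielded SAP layers many such indicators (unreferenced cells, constants hard-coded inside formulas, and so on), but even this one check shows why the approach surfaces "indications" rather than confirmed errors: a flagged cell may be a deliberate exception, so the inspector must still judge whether it is actually wrong.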

Introduction

Decades of research on spreadsheets have resulted in several commonly accepted tenets. Spreadsheets are widely used internationally in manufacturing (Schwartz, 2005), education (Wagner, 2003), business (Chan & Storey, 1996), and government (Butler, 2000). Spreadsheets are used to model problems ranging from the trivial to the epic, from simple calculations to incredibly complex amalgamations spanning multiple applications (Panko & Port, 2013; Warfield & Hihn, 2009) and tens of thousands of calculations (Panko & Port, 2013; Powell, Baker & Lawson, 2009).

Given the widespread use of spreadsheets, errors are a serious concern. Since 1995, nine studies have inspected operational spreadsheets to look for errors (Panko, 2013). These studies found errors in 84% of the 163 spreadsheets they inspected. Among inspection studies that reported only serious errors, the percentage of incorrect spreadsheets was even higher, at 91% (Panko, 2013). Although much research has been conducted on spreadsheet errors, there is no consensus on how to prevent, detect, and deal practically with them (Clermont & Mittermeir, 2002; Panko, 2007; Croll, 2003; Powell, Baker & Lawson, 2008; Butler, 2000; Rajalingham, Chadwick, Knight & Edwards, 2000).

One way to reduce errors is to inspect spreadsheets manually. Unfortunately, humans are only partially effective at detecting spreadsheet errors, even when using aggressive error detection techniques. In eight laboratory experiments with a combined 982 participants, participants discovered only 63% of all seeded errors (Panko, 2010). Manual code inspection in software testing experiments has shown similar detection rates (Panko, 1999). Not only are spreadsheets consistently developed with errors that people are generally poor at detecting, but developers are also overconfident both in their ability to build error-free models and in their ability to detect errors (Panko, 1999; Howe & Simkin, 2006; Davis & Ikin, 1987; Reithel, Nichols & Robinson, 1996).
