An Effective Approach for Evaluating Usability of Web Sites

Emel Kizilkaya Aydogan, Elif Kilic Delice, Petraq Papajorgji
DOI: 10.4018/978-1-4666-3946-1.ch008

Abstract

University Websites are among the most important Websites for students, university personnel, and researchers. However, usability has long been a major concern for Websites. This chapter develops an evaluation approach that combines usability criteria with the Fuzzy Analytic Hierarchy Process (FAHP) to compare Websites in a fuzzy environment. First, alternative Websites are examined by experts using usability criteria drawn from the literature. Then, FAHP is used to determine the weights of the usability criteria from the experts' subjective and imprecise judgments. Finally, the Websites are compared on the basis of these criteria weights.
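For illustration, one common way to derive crisp criterion weights from fuzzy expert judgments is Buckley's fuzzy geometric mean with centroid defuzzification. The minimal Python sketch below uses a hypothetical triangular fuzzy comparison matrix for three usability criteria; it is an illustrative assumption, not necessarily the exact FAHP procedure applied in this chapter.

```python
# A minimal FAHP sketch (Buckley's fuzzy geometric mean + centroid defuzzification).
# The matrix and criteria are hypothetical, not the chapter's data.
import numpy as np

# Triangular fuzzy pairwise comparisons (l, m, u) for three usability criteria.
# The reciprocal of (l, m, u) is (1/u, 1/m, 1/l).
F = np.array([
    [(1, 1, 1),       (2, 3, 4),     (4, 5, 6)],
    [(1/4, 1/3, 1/2), (1, 1, 1),     (1, 2, 3)],
    [(1/6, 1/5, 1/4), (1/3, 1/2, 1), (1, 1, 1)],
])  # shape (3, 3, 3): (criterion_i, criterion_j, [l, m, u])

n = F.shape[0]
# Fuzzy geometric mean of each row, component-wise over l, m, u.
r = F.prod(axis=1) ** (1.0 / n)                  # shape (n, 3)
# Fuzzy weights: r_i (x) (sum_k r_k)^(-1); the lower bound divides by the upper sum, etc.
s = r.sum(axis=0)                                # (sum_l, sum_m, sum_u)
fuzzy_w = np.column_stack([r[:, 0] / s[2], r[:, 1] / s[1], r[:, 2] / s[0]])
# Defuzzify by the centroid (l + m + u) / 3 and normalise to sum to 1.
crisp = fuzzy_w.mean(axis=1)
weights = crisp / crisp.sum()

print("FAHP criterion weights:", np.round(weights, 3))
```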
Chapter Preview

1. Introduction

Today, people increasingly expect more from the functionality of a Web site, so Web usability has emerged as an important topic. Users pay as much attention to the ease-of-use property of a Web site as they do to that of any other product. Consequently, there is an increasing emphasis on higher usability in the Web development industry and communities. Therefore, designers aim for Web sites with high usability (Delice and Gungor, 2009).

Usability is not a single property but a combination of several properties and attributes (Liljegren, 2006). Consequently, there are many definitions of usability in the literature (Constantine and Lockwood, 1999; Hix and Hartson, 1993; Shackel, 1991; Nielsen, 1993; Preece et al., 1994; Shneiderman, 1986; Wixon and Wilson, 1997). These definitions differ because authors hold different opinions on how usability should be measured, and many evaluation approaches and techniques are based upon specific definitions of usability. In the expert judgment-based approach, which is one type of evaluation approach, experts mostly rely on their experience to judge the ergonomic quality of alternative interfaces (Folmer and Bosch, 2004). When such experience is lacking, they appraise alternatives using user interface design guidelines (Nielsen, 1993; Smith and Mosier, 1986; Shneiderman, 1992), established human factors principles and standards (e.g., ISO, ANSI, DIN), and criteria (Ravden and Johnson, 1989; Scapin, 1990). In view of the multiplicity of criteria inherent in such decision-making situations, the methodology of Multiple-Criteria Decision Making (MCDM) is used as the framework of analysis (Park and Lim, 1999). MCDM is a powerful tool widely used for evaluating and ranking problems that involve multiple, usually conflicting criteria.
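To make the MCDM framing concrete, the following minimal Python sketch ranks three candidate Web sites with a simple weighted-sum aggregation. The criterion names, weights, and 0-1 expert scores are illustrative assumptions, not data from the chapter.

```python
# A minimal weighted-sum MCDM ranking sketch with hypothetical criteria and scores.
criteria_weights = {"navigation": 0.5, "content": 0.3, "visual_design": 0.2}

site_scores = {
    "site_A": {"navigation": 0.8, "content": 0.6, "visual_design": 0.7},
    "site_B": {"navigation": 0.6, "content": 0.9, "visual_design": 0.5},
    "site_C": {"navigation": 0.7, "content": 0.7, "visual_design": 0.9},
}

# Aggregate each site's score as the dot product of criterion weights and scores,
# then rank the sites from best to worst.
totals = {
    site: sum(criteria_weights[c] * score for c, score in scores.items())
    for site, scores in site_scores.items()
}
for site, total in sorted(totals.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{site}: {total:.2f}")
```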

In the usability literature, the Analytic Hierarchy Process (AHP), a widely used MCDM technique, is generally employed to weight the usability criteria and to evaluate interfaces on the basis of those weights. Park and Lim (1999) selected an interface from among alternatives using usability criteria and measures. The selection took place in two steps: in the first step, AHP was used to weight the usability criteria through expert-based evaluation and some of the interfaces were eliminated; in the second, usability testing was performed. In another study, Mitta (1993) evaluated computer interfaces with AHP on the basis of usability and learnability criteria and made a selection among alternative computer interfaces, showing that AHP can be applied to subjective data obtained from human factors studies. Ji et al. (2007) developed an AHP evaluation model for haptic interface design with quantitative assessments of the prototype, determining the absolute and relative importance of evaluation groups and factors at early design stages. Finally, in contrast to previous studies, Delice and Gungor (2009) used AHP to prioritize usability problems identified by Heuristic Evaluation (HE): first, usability problems are detected by the HE method, and then the severity ratings of these problems are evaluated by AHP. As a result, HE is refined by using AHP.
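As a concrete illustration of how AHP derives criterion weights from expert pairwise judgments, the Python sketch below applies the geometric-mean method and a standard consistency check to a hypothetical 3x3 Saaty-scale comparison matrix for three usability criteria; the matrix and criteria are assumptions for illustration only.

```python
# A minimal AHP sketch: criterion weights from a hypothetical pairwise comparison matrix.
import numpy as np

# Saaty-scale pairwise comparisons: entry [i][j] is how much more important
# criterion i is than criterion j (reciprocals on the opposite side).
pairwise = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Priority weights via the geometric-mean (row) method, normalised to sum to 1.
geo_means = pairwise.prod(axis=1) ** (1.0 / pairwise.shape[0])
weights = geo_means / geo_means.sum()

# Consistency check: lambda_max from the principal eigenvalue,
# CI = (lambda_max - n) / (n - 1), CR = CI / RI (RI = 0.58 for n = 3).
n = pairwise.shape[0]
lambda_max = np.max(np.linalg.eigvals(pairwise).real)
ci = (lambda_max - n) / (n - 1)
cr = ci / 0.58

print("criterion weights:", np.round(weights, 3))
print("consistency ratio:", round(cr, 3))  # CR below about 0.10 is conventionally acceptable
```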
