User Page Reviews in Usability Testing

Leo Lentz (Utrecht University, The Netherlands) and Sanne Elling (firMM Information + Service Design, The Netherlands)
DOI: 10.4018/978-1-4666-5129-6.ch007


Websites increasingly encourage users to comment on the quality of specific pages by clicking a feedback button and filling out a feedback form. The authors investigate users’ (N=153) ability to provide such feedback and the kind of feedback that results. They compare the results of these so-called user page review methods with the concurrent think-aloud method, applied to the same websites. Results show that it is important to keep feedback tools both simple and attractive so that users are able and willing to provide feedback. The authors also find that the number of problem detections is higher in the review condition, but the two methods seem to be highly complementary. An analysis of the detections from a practice-oriented perspective reveals that the overlap between the two methods is rather high and that reviewing participants seem capable of signalling important problems that are also exposed in a think-aloud study.
Chapter Preview


Organizations increasingly recognize the importance of giving the user a voice, and many websites contain a feedback option that invites users to comment on various aspects of the website. In this chapter, we focus on evaluation methods that enable users to give feedback on specific pages of a website. Such methods have received little attention in the literature about website evaluation methods, and there is no generally accepted term for this type of method yet. These methods, however, can be categorized as self-reported metrics (Tullis & Albert, 2008) because, as in surveys, users are asked about their experiences on the website. But unlike surveys, the methods we focus on ask for page-level feedback and allow for open comments, sometimes combined with an overall rating or some scale questions. We propose to call these methods user page reviews (UPR) because they invite users to review a website by clicking on a button that appears on selected pages. In such reviews, users evaluate a website in much the same way as experts do (Welle Donker-Kuijer, De Jong, & Lentz, 2008). Of course, users cannot be expected to have professional expertise about Web design, but they may be able to provide useful feedback from their own user perspective about their attitudes and experiences with the website. These comments might shed light on user problems that experts fail to diagnose because of their curse of expertise (De Jong & Lentz, 1996).

Tools that can be used for gathering user feedback on website pages include Opinionlab, Kampyle, Usabilla, and Infocus. These instruments enable website visitors to share their opinions on everything they consider important. Selected website pages (or sometimes all pages) contain a button that users can click if they want to react to something. These buttons can be small icons (e.g., a thumb or a plus–minus icon with the word feedback) or longer text links that invite reactions to the page. Users click the link to open a screen on which they can provide their comments. Users can give open-ended comments, but the instruments often also ask them to choose a feedback category, provide ratings, or answer questions about page-specific topics. Figures 1-4 show screenshots of feedback forms from Opinionlab, Kampyle, Infocus, and Usabilla, respectively.

Figure 1.

Opinionlab feedback form

Figure 4.

Usabilla add-note option and invitation to users to click on elements they like

The Opinionlab form in Figure 1 looks rather dense and asks users to complete several tasks: choose a topic from predefined categories, enter an open comment, rate the page on three aspects as well as overall, enter an (optional) e-mail address, and indicate whether or not their comment is about the website. The Kampyle form in Figure 2 uses icons that users select to express their feelings and to categorize their feedback: it asks users to rate the site by choosing an emoticon, to select a feedback topic (beneath the topics visible in the figure are rows of subtopics), and to fill in an open comment. The Infocus form (see Figure 3) includes a list of predefined categories, a place to formulate comments, and three options for marking specific elements or segments: users can underline, point an arrow at, or draw a frame around a relevant section they want to comment on. This marking function distinguishes Infocus from most other tools, in which users have to describe the exact location of the object of their feedback. Only Usabilla (see Figure 4) offers a comparable form of marking, enabling users to add a note to the web page, but it is not possible to mark the exact size of the selection or to use other, more precise markings. Besides these four feedback tools, many examples of feedback buttons can be found on websites.
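The page-level feedback these tools collect can be pictured as a small structured record combining an open comment with a category and optional ratings. The TypeScript sketch below is a generic illustration of such a submission and its basic validation; the field names and the 1–5 rating scale are assumptions chosen for illustration, not any of the named vendors' actual APIs.

```typescript
// Generic shape of a user page review (UPR) submission.
// Field names and the 1-5 rating scale are illustrative assumptions.
interface PageReview {
  pageUrl: string;   // page on which the feedback button was clicked
  category: string;  // predefined feedback topic chosen by the user
  comment: string;   // open-ended comment
  rating?: number;   // optional overall rating on an assumed 1-5 scale
  email?: string;    // optional contact address
}

// Basic client-side validation before submission: require a category and a
// non-empty comment, and keep any rating within the assumed 1-5 scale.
function validateReview(r: PageReview): string[] {
  const errors: string[] = [];
  if (!r.category.trim()) errors.push("category is required");
  if (!r.comment.trim()) errors.push("comment is required");
  if (r.rating !== undefined && (r.rating < 1 || r.rating > 5)) {
    errors.push("rating must be between 1 and 5");
  }
  return errors;
}
```

Keeping the required fields to a minimum, as in this sketch, reflects the chapter's finding that feedback tools should stay simple so that users remain willing to complete them.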

Figure 2.

Kampyle feedback form
