Whose Questionnaire is It, Anyway?

Andrew Saxon, Shane Walker, David Prytherch
DOI: 10.4018/jitwe.2009100101

Abstract

This chapter focuses on the adoption and adaptation of methodologies drawn from research in psychology for the evaluation of user response as a manifestation of the mental processes of perception, cognition, and emotion. The authors present robust alternative conceptualizations of evaluative methodologies that allow the views, feelings, and opinions of individual users to surface, producing a richer, more informative texture for user-centered evaluation of software. This differs from more common user questionnaire systems such as the Questionnaire of User Interface Satisfaction (QUIS) (Norman et al., 1989). The authors present two example methodologies so that readers can, first, review the methods as a theoretical exercise and, second, apply similar adaptation principles to derive methods appropriate to their own research or practical context.

Introduction

Viewed from a design perspective, there appears to be a lack of empirical research investigating the determinants of important aspects of behavior such as emotion and motivation, and how an understanding of these may influence designers’ decisions in the software evaluation process. The ubiquitous nature of information technology today means that the computer is no longer just a tool for those who are compelled to use it, or who have to learn to use it, as was the case in the 1980s. Interfaces, particularly on the Internet, must appeal to a broad base of users with varying levels of skill and ability, and should work the first time to ensure the user is not ‘put off’ the experience. Aesthetic considerations may also be significant in this context (Hartmann, Sutcliffe & De Angeli, 2007). Modern psychological theories of motivation (e.g., Ford, 1992) agree on a basic structure of component processes: goal-directed activity, an individual’s belief in their skills and in the context within which they will work, and finally their emotions.

Motivation is a rather abstract term that has historically challenged psychologists to provide satisfactory definitions. Unified theories that attempt to satisfactorily explain human motivation have been developed only relatively recently, and research on motivation within HCI, such as the Technology Acceptance Model (TAM) (Davis, 1989), supports the argument that visual communication and functionality (perceived ease of use) influence users’ motivation and change user behavior in ways that affect usability. Research has shown that highly motivated users experience less anxiety, have higher perceptions of self-efficacy, and hold more positive attitudes towards the software (Davis, 1989).

In order to assess how far design techniques applied to the user interface can harmonize with psychological needs for optimal performance on specific tasks and attainment of goals, we must base questions on a fundamental understanding of key influencing variables of the interaction process, together with a clear knowledge of their relative importance to the individual user. In perceptual terms, interactive computer systems are not just representations of knowledge, but interactive experiences that should seek to fully exploit the user’s senses and emotions, developing new ways to deliver effective communication.

Variables during interaction that can influence user motivation lie in the gulf between executing the task and evaluating it. Users evaluate their goals, their own ability to attain them, and the potential of the context (in this case, the computer system) to support them in this activity. Evaluation is ongoing, as perception is regularly matched against expectations, and it is a good indicator of how successful the interface is. This gulf may be bridged from either direction, the computer or the user. The system designer can bridge it by creating interaction mechanisms that better match the psychological needs of the user as evidenced by the task model.

We present two different examples, describing tested methodologies for addressing these needs, though many other comparable adaptations of methodologies from other domains might be similarly useful (e.g., Greenberg et al., 2000; Hollan, Hutchins & Kirsh, 2000; Duric et al., 2002).

The first example is derived from Motivation Systems Theory (MST) (Ford, 1992) wherein components of human behavior are modeled as simple behavioral processes. MST integrates the conceptual frameworks of 32 motivational theories around its core concepts and was also found to compare well with models and theories already used to describe user interaction in HCI.

The second example is derived from Kelly’s (1955) Repertory Grid Technique. This method takes a highly qualitative approach to surfacing a user’s experience within his or her own frame of reference. The technique is highly amenable to customization by the experimenter to suit the particular needs of an investigation, and the customizations carried out by the authors are described in detail.
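In broad terms, a repertory grid rates a set of elements (the things being evaluated) against bipolar constructs elicited from the participant. The sketch below illustrates that structure only; the elements, constructs, and ratings are hypothetical and not taken from the authors' study.

```python
# Minimal sketch of a repertory grid data structure (after Kelly, 1955).
# All elements, constructs, and ratings here are invented for illustration.
from statistics import mean

# Elements: the things a participant evaluates (e.g., parts of an interface).
elements = ["home page", "search form", "checkout"]

# Bipolar constructs elicited from the participant; ratings run 1..5,
# where 1 means the left pole applies and 5 means the right pole applies.
ratings = {
    ("simple", "cluttered"): [2, 4, 3],   # one score per element, in order
    ("responsive", "sluggish"): [1, 2, 5],
}

def construct_means(ratings):
    """Average rating for each construct across all elements."""
    return {poles: mean(scores) for poles, scores in ratings.items()}

def element_profile(ratings, elements, name):
    """All construct ratings for a single element."""
    i = elements.index(name)
    return {poles: scores[i] for poles, scores in ratings.items()}

print(construct_means(ratings))
print(element_profile(ratings, elements, "checkout"))
```

In practice the grid is analyzed in richer ways (e.g., clustering elements by construct profiles), but even simple per-construct means and per-element profiles show how the participant's own frame of reference structures the data.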

To validate these derived methodologies, the authors used them as part of a suite of usability tests run on Webomatic, a website design application aimed at UK Small and Medium-sized Enterprises (SMEs). Webomatic was itself one of the outcomes of an earlier project, part-funded by the European Union Regional Development Fund (ERDF), investigating the implications of design for e-commerce (Figure 1).
