Analyzing User Behavior in Digital Games

Anders Drachen, Alessandro Canossa
Copyright: © 2012 | Pages: 28
DOI: 10.4018/978-1-60960-774-6.ch001

Abstract

User research in digital game development has in recent years begun to expand from its previous existence on the sidelines of development to a central factor in game production, in recognition that the interaction between user and game is crucial to the perceived user experience. Paralleling this development, the methods and tools available for conducting user research in industry and academia are changing, with modern methods being adopted from Human-Computer Interaction (HCI). Ubiquitous tracking of player behavior and player-game interaction forms one of the most recent additions to the arsenal of user-research testers in game development and game research. Player behavior instrumentation data can be recorded during all phases of game development, including post-launch, and provides a means of obtaining highly detailed, non-intrusive records of how people play games. Behavioral analysis is a relatively recent addition to game development and research; however, it is central to understanding how games are being played. In this chapter, the current state of the art of behavior analysis in digital games is reviewed, and a series of case studies is presented that showcases novel approaches to behavior analysis and how these can inform game development during production. The case studies focus on the major commercial game titles Kane & Lynch: Dog Days and Fragile Alliance, both developed by IO Interactive/Square Enix Europe.
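To make the notion of behavioral instrumentation concrete, the following is a minimal sketch in Python of how gameplay events might be logged non-intrusively during play. All names and fields (TelemetryEvent, player_death, session_telemetry.jsonl, etc.) are hypothetical illustrations introduced here, not drawn from the chapter or from any specific game or engine.

```python
# Minimal sketch of non-intrusive player-behavior instrumentation.
# All names and fields are illustrative assumptions, not taken from
# the chapter or from any real game engine's telemetry API.
import json
import time
from dataclasses import dataclass, field, asdict
from typing import Any

@dataclass
class TelemetryEvent:
    player_id: str                         # anonymized player identifier
    event_type: str                        # e.g. "player_death", "item_pickup"
    timestamp: float                       # seconds since session start
    position: tuple[float, float, float]   # in-game world coordinates
    attributes: dict[str, Any] = field(default_factory=dict)

class TelemetryLogger:
    """Buffers gameplay events in memory and flushes them as JSON lines,
    so recording does not interrupt the player's experience."""

    def __init__(self, path: str):
        self.path = path
        self.buffer: list[TelemetryEvent] = []

    def log(self, event: TelemetryEvent) -> None:
        self.buffer.append(event)

    def flush(self) -> None:
        with open(self.path, "a", encoding="utf-8") as f:
            for event in self.buffer:
                f.write(json.dumps(asdict(event)) + "\n")
        self.buffer.clear()

# Example: record a death event during play, invisible to the player.
logger = TelemetryLogger("session_telemetry.jsonl")
logger.log(TelemetryEvent(
    player_id="p-0042",
    event_type="player_death",
    timestamp=time.monotonic(),
    position=(112.4, 8.0, -33.7),
    attributes={"cause": "fall_damage", "map": "level_03"},
))
logger.flush()
```

Because events like these are emitted by the game itself, they can be collected at every phase of development and after launch without intruding on play, which is what makes the resulting records both detailed and ecologically valid.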
Chapter Preview

Introduction

Computer games have evolved from simple text-based adventures like Colossal Cave and Akalabeth to virtually photo-realistic renditions of virtual worlds with advanced mechanics, spreading across a dozen or more genres and offering an increasing number of entertainment opportunities (Bateman & Boon, 2005). This development is to no small degree driven by the evolution of gaming devices: the hardware platforms upon which game software runs are becoming more and more diverse, and thanks to the increasing connectedness of e.g. mobile networks, users are finding digital games accessible everywhere. The increased complexity of digital games – in terms of the number of possible user actions and behaviors that they afford, as well as the breadth of interaction options between the user and the software/hardware – and their diversity and distribution across different hardware devices (Lazzaro & Mellon, 2005; Mellon, 2009; Pagulayan, Keeker, Wixon, Romero & Fuller, 2003) are among the important factors driving an increased focus on the users, the players, of digital games in the game development industry. Contemporaneously with this development in game design, user research and user-oriented testing have become progressively more important to industrial development and quality assurance (Kim et al., 2008; Pagulayan, Keeker, Wixon, Romero & Fuller, 2003). The purpose of user-oriented game testing is to evaluate how specific components of a game, or the game in its entirety, are played by people, allowing designers to evaluate whether their ideas and work provide the experience they were designed for. User-oriented testing is useful in game production because the perceived quality of a digital game product is generally related to the perceived user experience. Content testing is therefore receiving increasing attention from industry and academia alike (e.g. Isbister & Schaffer, 2008; Jørgensen, 2004; Kim et al., 2008; Nørgaard & Rau, 2007).

Methods adopted from Human-Computer Interaction (HCI) (Hilbert & Redish, 1999; Kuniavsky, 2003) have begun to replace the traditional informal testing approaches used in game development and game research, with e.g. usability, playability and user behavior forming keywords in contemporary user-oriented testing and research (Davis, Steury & Pagulayan, 2005; Isbister & Schaffer, 2008; Medlock, Wixon, Terrano, Romero & Fulton, 2002; Pagulayan, Keeker, Wixon, Romero & Fuller, 2003). Different methodological approaches have different strengths and weaknesses: qualitative approaches, for example, are excellent for acquiring in-depth feedback from players (users) but require substantial resources. In comparison, quantitative approaches are generally better suited for larger participant groups, but less suited for in-depth analysis of user behavior and experience. Given the limited resources of industrial testing, considerable focus has therefore been directed towards quantitative methods.
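As a simple illustration of why quantitative approaches scale to large participant groups, the following Python sketch aggregates the hypothetical JSON-lines telemetry from the earlier instrumentation sketch into per-map death counts. The field names (event_type, attributes, map) are assumptions carried over from that sketch, not an analysis pipeline described in the chapter.

```python
# Sketch of a basic quantitative behavioral analysis: counting player
# deaths per map across all logged sessions to spot problem areas.
# Assumes the hypothetical JSON-lines format from the earlier sketch.
import json
from collections import Counter

def deaths_per_map(path: str) -> Counter:
    counts: Counter = Counter()
    with open(path, encoding="utf-8") as f:
        for line in f:
            event = json.loads(line)
            if event["event_type"] == "player_death":
                counts[event["attributes"].get("map", "unknown")] += 1
    return counts

if __name__ == "__main__":
    for map_name, n in deaths_per_map("session_telemetry.jsonl").most_common():
        print(f"{map_name}: {n} deaths")
```

The same aggregation runs unchanged whether the file holds ten play sessions or ten thousand, which is the scaling property that makes quantitative methods attractive under the resource constraints of industrial testing; what such counts cannot reveal on their own is why players die where they do, which is where qualitative methods retain their value.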
