The Magic Bullet: A Tool for Assessing and Evaluating Learning Potential in Games

Katrin Becker
Copyright: © 2011 | Pages: 13
DOI: 10.4018/ijgbl.2011010102

Abstract

This paper outlines a simple and effective model that can be used to evaluate and design educational digital games. It also facilitates the formulation of strategies for using existing games in learning contexts. The model categorizes game goals and learning objectives into one or more of four possible categories. An overview of the model is provided and the four categories are defined. The model is used to analyze several games. The implications that this model has for the design and use of games as instructional technologies are then described.

Introduction

We have been using media to augment and enhance learning interventions since our very beginnings. Stories, among the oldest instructional technologies known to man, came to life around the campfire through the skillful use of the teller’s voice, sound effects, body movements, and sometimes props and firelight – in other words, through an early form of communication media. Ever since we began to examine learning in a more formal way, we have struggled to find effective ways to assess the value and efficacy of the technologies we use to deliver instruction. Many of the methodologies we employ focus on the learner (Dick, Carey, & Carey, 2001; Pirnay-Dummer, Ifenthaler, & Spector, 2010; Sims, 2006), and this is very important, but as instructional objects become more complex and more expensive to design, it also becomes important to have ways of evaluating the object itself. It is useful to be able to assess a learning object while it is still in the design stages, and, with more and more ready-made objects available, it is equally useful to have a methodology for drawing up a short list of candidates when choosing among many options. Although there is no shortage of resources on how to design and build digital educational applications, approaches to evaluating them are far less plentiful (Schleyer & Johnson, 2003). This is especially true of interactive objects.

Videogames are among the most highly interactive digital media currently known, and this sets them apart from other digital and mass media. They share qualities with many other media forms, to be sure, but they also have qualities that are uniquely their own (Egenfeldt-Nielsen, 2004). A key aspect of games is that players proceed by doing things, and this experiential quality lies at the very core of game design. A game is not a game if there is no interaction – in other words, the environment must change as a result of player actions – and videogames are popular precisely because of the experience they provide. Games designed for learning can do no less. Thus, any epistemology of games for learning must begin with the experience (Squire, 2006).

Formative and summative evaluation of instructional materials are essential elements of the instructional design process, but when it comes to software, and especially to digital games and simulations, it is equally essential to be able to evaluate the software itself before it is used in a real situation. Evaluations and reviews of software do exist, but they often say little about what and how students will learn (Kafai, Franke, & Battey, 2002). This paper outlines a simple yet effective model that can be used to help evaluate existing games and to design new digital games for educational purposes. Further, this model can help educators formulate strategies for using existing games within a learning context. The first part of this paper provides an overview and explanation of the model, and the second part uses the model to analyze several popular commercial and educational digital games. Implications for game-based learning in formal settings are discussed in the conclusion.
