Towards a Living Systematic Review: Updating Trends of Psychometric Analysis Evidence in Game-Based Assessments

Copyright: © 2023 | Pages: 36
DOI: 10.4018/979-8-3693-0568-3.ch002

Abstract

Calls for examining the psychometric properties of games as assessments have continued to circulate in the years since de Klerk and colleagues' (2015) systematic review. Therefore, the current authors posit that the foundation created by de Klerk and colleagues should be treated as a living systematic review and updated with the most current research. Here, the initial step is taken in updating de Klerk and colleagues' systematic review by asking the research question: What psychometric trends have emerged surrounding the analysis of performance data in game-based assessments within current educational research? First, the current authors update the systematic review to cover the intervening years (2014-2021). Second, the current authors examine the trends found in de Klerk and colleagues' work alongside the updated findings. As an extension of de Klerk and colleagues' analysis, the current authors also evaluate the sources of validity evidence and, if present, the framework used in each article to align the in-game data to the measured knowledge, skill, or ability (KSA).
Chapter Preview

Introduction

Video games have become synonymous with learning (Steinkuehler, Squire, & Barab, 2012). Prensky (2001) postulated the seminal theory of digital game-based learning (DGBL): that games are fun and engaging, and that they fundamentally address the changes in 21st-century learning and in the ways learners think and behave. Since then, scholars have argued forcefully that video games are positive learning tools both in and out of educational settings (Squire, 2012). Naturally, researchers began to examine games as assessments. The field of game-based assessment (GBA) provided multiple methods (Shute & Ke, 2012) and analytic techniques (McCreery et al., 2019) for assessing learning within games. Assessment is defined by its purpose, its feedback, and its production of evidence of some knowledge, skill, or ability (KSA) (Messick, 1994, 1995; Scriven, 1966). Further, the psychometric properties of validity, reliability, and fairness are fundamental principles of assessment, both traditional and game-based (Messick, 1994; Mislevy, Behrens, Dicerbo, Frezzo, & West, 2012). However, empirical evidence supporting psychometrically sound game-based assessments is scant. Shute and Ventura (2013), for example, outlined the issues surrounding traditional assessment psychometrics and how those same properties can suffer similar problems within GBA unless they are specifically addressed in both research and practice. Shute's (2011) work in stealth assessment (assessment embedded within a digital learning environment and invisible to the learner) attempted to reconcile these psychometric issues via evidence-centered design (ECD; Mislevy, Steinberg, & Almond, 2003). ECD is an extension of Messick's (1994) assessment framework for formative and summative evaluations (see also Black & Wiliam, 1998). Ren (2019) further explicated the importance of the design, analysis, and interpretation of gameplay data when examining GBA, and stressed that psychometric properties should be deliberately embedded within the gaming environment a priori, something much of the field still lacks (see Ifenthaler, Eseryel, & Ge, 2012; Ifenthaler & Kim, 2019).

Although GBA, stealth assessment, and broader game-based studies have provided knowledge on the design, implementation, and interpretation of game-based performance data, it is clear throughout the literature that the psychometric properties of games as assessments lack cohesion beyond claims that games are beneficial learning environments (Oranje, Mislevy, Bauer, & Jackson, 2019). Thus, it is imperative that researchers continue to examine the psychometric properties of games in order to develop accurate assessment measures for game-based learning (Ke & Shute, 2015). de Klerk, Veldkamp, and Eggen (2015) addressed this call by conducting a systematic review of the psychometric analysis of performance data from simulation-based and game-based assessments in order to provide an understanding of the psychometric trends surrounding games as assessments. A systematic review is a method of comprehensively identifying, appraising, summarizing, and synthesizing all the relevant studies on a given topic in order to inform policy and practice (Petticrew & Roberts, 2008).

Critically, as static documents, systematic reviews are limited to the evidence available at the time data were collected. To address this limitation, systematic review updates should be conducted routinely (Elliott et al., 2014). Indeed, calls for examining the psychometric properties of games as assessments have continued to circulate in the intervening years (see Frey, 2018; Ifenthaler & Kim, 2019), and there is a need in the field to update de Klerk and colleagues' (2015) work.
