Advances in Assessment of Students' Intuitive Understanding of Physics through Gameplay Data

Mario M. Martinez-Garza, Douglas Clark, Brian Nelson
DOI: 10.4018/ijgcms.2013100101

Abstract

In this paper, the authors present advances in analyzing gameplay data as evidence of learning outcomes using computational methods of statistical analysis. These analyses were performed on data gathered from the SURGE learning environment (Martinez-Garza, Clark, & Nelson, 2010). SURGE is a digital game designed to help students articulate their intuitive concepts of motion physics and organize them toward a more normative scientific understanding. Recurring issues of assessment, which pervade assessment of learning in games more generally, prompted the authors to consider whether gameplay (the actions of learners in the context of the game) can be analyzed to produce evidence of learning. The authors describe their approach to the analysis of gameplay in terms of quantitative assessment, which they believe may lay the groundwork for the application of similar computationally intensive techniques in other educational game contexts.
Article Preview

Theoretical Framework

The potential of video games to support science learning is generally agreed upon (Gee, 2007; Mayo, 2009; Squire et al., 2003), but analyzing and structuring the evidence for game-based learning remains a challenge. This difficulty, in turn, has supported a mixed view of the effectiveness of games as tools for learning (Foster & Mishra, 2008; O'Neil, Wainess, & Baker, 2005). We believe, however, that this conclusion may be premature. The past fifteen years have seen great advances both in the sophistication of game designs and in the supporting technology; there simply has not been enough time for a commensurate evolution in appropriate research methods. One central methodological difficulty is that game-induced learning tends to be strongly situated within the game context, and is therefore hard to capture and measure in out-of-game contexts such as post-tests. More advanced game designs compound this problem by supporting complex player actions that are challenging for learners to summarize and express, difficult for instruments to capture reliably, and resistant to conventional analytical methods. In addition, using formal assessments alongside games can compromise a game's capacity for engagement and immersion, potentially reducing the efficacy of both the learning experience and the assessment.
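
As a concrete illustration of what treating gameplay as evidence might involve, consider the minimal Python sketch below. The preview does not specify SURGE's telemetry format, action vocabulary, or statistical models, so the event schema, the action names, and the choice of simple per-player action counts as features are all illustrative assumptions rather than the authors' actual pipeline.

```python
# Hypothetical sketch only: SURGE's real telemetry schema and feature set are
# not given in the preview, so the fields and actions below are assumptions.
from collections import defaultdict

# Assume each logged gameplay event is a (player_id, action, timestamp) record.
events = [
    ("p01", "thrust_right", 12.4),
    ("p01", "thrust_right", 13.1),
    ("p01", "coast", 15.0),
    ("p02", "thrust_left", 3.2),
]

def extract_features(events):
    """Aggregate raw events into per-player action counts: a minimal feature
    vector that a statistical model could treat as in-game evidence of learning."""
    counts = defaultdict(lambda: defaultdict(int))
    for player, action, _timestamp in events:
        counts[player][action] += 1
    return {player: dict(actions) for player, actions in counts.items()}

print(extract_features(events))
# {'p01': {'thrust_right': 2, 'coast': 1}, 'p02': {'thrust_left': 1}}
```

In practice such features would be far richer (timing, action sequences, error patterns) before feeding the kinds of computationally intensive statistical analyses the abstract describes; this sketch shows only the basic log-to-feature step.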

Using assessments that reside outside a game to measure learning that happens inside the game presents issues and vulnerabilities that merit careful consideration. Assessment is, after all, not a neutral activity. Every assessment carries assumptions about the nature of learning, the nature of knowledge, and the purpose of assessment itself (Willis, 1993). The act of assessment places a premium on certain forms of knowing and understanding while de-emphasizing others. In the case of games for learning science, for example, an assessment may privilege declarative forms of knowledge, e.g., definitions and abstract principles, while the game itself might be more productive in reinforcing tacit knowledge or a qualitative understanding of relationships. This insight becomes even more salient given the contrast between two types of games for learning: those in which the curricular concepts are embedded in a game environment that serves mainly as context ("conceptually-embedded" games) and those in which the material to be learned is integrated into the core gameplay mechanics with which the player constantly interacts ("conceptually-integrated" games) (Martinez-Garza, Clark, & Nelson, 2012). It follows that these two kinds of games favor different assessment strategies, given the differences in how they engage the learner, how they gauge success in the game, and how they represent knowledge. These nuances are not well captured by traditional assessments of learning, which favor summative declarations of concepts articulated in discipline-specific forms and language (Sutton, 1996; Fang, Lamme, & Pringle, 2010).
