A Framework for Structuring Learning Assessment in a Massively Multiplayer Online Educational Game: Experiment Centered Design


Shawn Conrad, Jody Clarke-Midura, Eric Klopfer
Copyright: © 2014 | Pages: 23
DOI: 10.4018/IJGBL.2014010103

Abstract

Educational games offer an opportunity to engage and inspire students to take interest in science, technology, engineering, and mathematics (STEM) subjects. Unobtrusive learning assessment techniques coupled with machine learning algorithms can be used to record students' in-game actions and formulate a model of each student's knowledge without interrupting play. This paper introduces “Experiment Centered Assessment Design” (XCD), a framework for structuring a learning assessment feedback loop. XCD builds on the “Evidence Centered Assessment Design” (ECD) approach, which uses tasks to elicit evidence about students and their learning. XCD defines every task as an experiment in the scientific method, where an experiment maps a test of factors to observable outcomes. The XCD framework was applied to prototype quests in a massively multiplayer online (MMO) educational game. Future work will build upon the XCD framework and use machine learning techniques to provide feedback to students, teachers, and researchers.
Article Preview

Introduction

Open-world games like massively multiplayer online role-playing games (MMORPGs) encourage exploration and experimentation. In these environments, learning is situated in problem spaces that involve hypothesizing, probing, observing, reflecting, and recycling these steps (Gee, 2003). The open world gives players the freedom to move and act within the game environment instead of following predefined paths and action sequences (Blizzard Entertainment Inc., 2012). While research has documented how such games can engage and inspire students to take interest in science, technology, engineering, and mathematics (STEM) subjects (Steinkuehler & Duncan, 2008), the field is only beginning to explore how they can be used for assessment. The extended capabilities of MMORPGs allow for a new, innovative approach to assessment. Unlike traditional assessments, which rely on students' responses to discrete items, assessment in MMORPGs can draw on evidence captured in situ, during game play. In this paper, we describe how unobtrusive learning assessment techniques coupled with machine learning algorithms can be used to record a student's in-game actions and formulate a model of the student's knowledge without interrupting game play. We introduce “Experiment Centered Assessment Design” (XCD), a framework for structuring a learning assessment feedback loop. XCD builds on the “Evidence Centered Assessment Design” (ECD) approach (Mislevy & Haertel, 2006), which uses tasks to elicit evidence about students and their learning. XCD defines every task as an experiment in the scientific method, where an experiment maps a test of factors to observable outcomes. The XCD framework was applied to prototype quests in an educational MMORPG, The Radix Endeavor, being developed at The Education Arcade at the Massachusetts Institute of Technology. A brief illustrative sketch of this factor-to-outcome mapping follows this introduction.

In the following sections, we provide background and context by first describing The Radix Endeavor. We then present an overview of learning assessment through Evidence Centered Design. Next, we describe the Experiment Centered Design assessment framework. Then we provide examples of Experiment Centered Design used in The Radix Endeavor quests. Finally, we conclude with further ideas to expand Experiment Centered Design.
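
To make concrete the idea that an XCD experiment maps a test of factors to observable outcomes, the sketch below shows one way such a mapping and its unobtrusive evidence log might be represented in Python. This is a minimal illustration and is not code from The Radix Endeavor: the class names, the quest identifiers, and the toy scoring rule are all hypothetical, and a deployed system would replace the scoring rule with a trained machine learning model.

# Hypothetical sketch of an XCD-style experiment record and evidence log.
# None of these names come from The Radix Endeavor; they only illustrate
# "an experiment maps a test of factors to observable outcomes."

from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class Experiment:
    """One in-game task framed as a scientific experiment."""
    quest_id: str
    factors: Dict[str, str]      # variables the player manipulates
    observables: List[str]       # outcomes the game can log


@dataclass
class EvidenceLog:
    """Unobtrusive record of a student's actions during the experiment."""
    student_id: str
    quest_id: str
    actions: List[Dict[str, str]] = field(default_factory=list)

    def record(self, action: str, detail: str) -> None:
        self.actions.append({"action": action, "detail": detail})


def knowledge_estimate(log: EvidenceLog, expected: List[str]) -> float:
    """Toy scoring rule: fraction of expected observables present in the log.
    A real system would replace this with a trained machine learning model."""
    seen = {a["action"] for a in log.actions}
    return sum(1 for e in expected if e in seen) / len(expected)


if __name__ == "__main__":
    # Hypothetical quest in which a player crosses plants with contrasting traits.
    breeding_quest = Experiment(
        quest_id="trait_breeding_demo",
        factors={"parent_a_trait": "tall", "parent_b_trait": "short"},
        observables=["cross_plants", "tally_offspring", "submit_ratio"],
    )

    log = EvidenceLog(student_id="s001", quest_id=breeding_quest.quest_id)
    log.record("cross_plants", "tall x short")
    log.record("tally_offspring", "3:1 tall to short")

    print(knowledge_estimate(log, breeding_quest.observables))  # ~0.67

In a full system, such a log would feed what ECD terminology calls the evidence model, which in turn updates the student model and closes the feedback loop to students, teachers, and researchers described in the abstract.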
