Game Play Differences by Expertise Level in Dota 2, A Complex Multiplayer Video Game

Lisa Castaneda, Manrita Kaur Sidhu, Jonathan J. Azose, Tom Swanson
DOI: 10.4018/IJGCMS.2016100101

Abstract

Dota 2, a complex team-based video game, was used to study expertise and attentional allocation in a multiplayer online battle arena (MOBA) setting. Pre- and post-play survey questions and eye-tracker data were collected from 67 video game players during a session of Dota 2 play. Questions explored abstract versus concrete conceptualizations of game-play and individual versus team focus. Quantitative eye-tracker data were evaluated for differences in visual attention and scan patterns. The authors noted that novices reflected on more concrete game elements and were likely to look back at the same location twice in a row. There was no difference among player categories in the amount of time spent looking at the mini-map or in self- versus team-focus; however, experts were better able to reflect on abstract game concepts. Expert-novice differences in this study are similar to expertise research findings from other domains. The qualitative and unique quantitative metrics that can be gathered from complex games may provide insight into the development of expertise.
Article Preview

Introduction

Expertise studies have long been an important tool for investigating how individuals develop skills in a given domain. A common approach to studying skill acquisition is to examine individuals identified as being at the “top” of their field (Bloom, 1985). However, gaining a clear sense of how an expert accomplishes a task is a challenge. Experts are not always consciously aware of all the knowledge and skills they use within their domain, making self-reports of expert technique somewhat unreliable (Feldon, 2007; Sullivan, Yates, Inaba, Lam & Clark, 2014). Eye-tracking is a methodology that has been used to examine expertise with regard to visual processing in a variety of domains, such as medicine (Kundel, Nodine, Conant & Weinstein, 2007), games (Almeida, Veloso, Roque & Mealha, 2011), sports (North, Williams, Hodges, Ward & Ericsson, 2009), and aviation (Kasarkis, Stehwien, Hickox & Aretz, 2001). Because eye-tracking shows exactly where participants are focusing, it removes some of the ambiguity about what experts are actually doing at a given moment in time. In games research, proprietary in-game performance metrics provide another quantitative measure of an individual’s performance relative to other players. These metrics often rely on complex algorithms that encompass far more than time played, since total experience is a component of, but not necessarily a sufficient indicator of, expert performance (Ericsson & Lehmann, 1996; Gegenfurtner, Lehtinen & Saljo, 2011; Feltovich, Prietula & Ericsson, 2006). Utilizing methods that are not based solely on self-reports but also include objective measures of skill may aid in better understanding how expertise develops (Tan, Leong & Shen, 2014). Such a mixed-method approach, combining game metrics, eye-tracking, and in-person interviews as applied in this study, can provide a range of measures through which to study expertise.
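
To make concrete what such objective eye-tracking measures can look like, the minimal sketch below computes two simple summaries from a sequence of fixations: total dwell time per area of interest (AOI) and the fraction of fixation transitions that return to the AOI just viewed (the kind of "looking back at the same location twice in a row" pattern noted in the abstract). This is an illustrative sketch only; the input format, AOI labels such as "minimap", and the specific metrics are assumptions for exposition, not the instrumentation or analysis pipeline used in the study.

from collections import defaultdict

def summarize_fixations(fixations):
    """Summarize fixations given as (aoi, duration_ms) tuples.

    Returns total dwell time per AOI and the consecutive-refixation
    rate (share of transitions that stay on the same AOI).
    Hypothetical metrics for illustration only.
    """
    dwell_ms = defaultdict(float)
    revisits = 0
    transitions = 0
    previous_aoi = None
    for aoi, duration_ms in fixations:
        dwell_ms[aoi] += duration_ms
        if previous_aoi is not None:
            transitions += 1
            if aoi == previous_aoi:
                revisits += 1
        previous_aoi = aoi
    refixation_rate = revisits / transitions if transitions else 0.0
    return dict(dwell_ms), refixation_rate

# Example with made-up data; AOI names and durations are illustrative.
sample = [("minimap", 180), ("minimap", 220), ("hero", 400),
          ("shop", 150), ("hero", 310), ("hero", 290)]
dwell, rate = summarize_fixations(sample)
print(dwell)           # {'minimap': 400.0, 'hero': 1000.0, 'shop': 150.0}
print(round(rate, 2))  # 0.4 (2 of 5 transitions stay on the same AOI)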

Games have long been used to study skill acquisition and performance among novices and experts. Early work in chess by Chase and Simon (1973) and de Groot (1978) laid the groundwork for games-and-expertise studies by demonstrating novice-expert differences in a highly trackable environment. A commonly used definition of expert performance was proposed by Ericsson and Lehmann (1996), “as consistently superior performance on a specified set of representative tasks for a domain,” and that is how we operationally use the term here. Ericsson and Smith (1991) highlighted the need to design studies that observe experts in a controlled setting, analyze their cognitive processes, and propose explicit learning mechanisms for skill acquisition. The introduction of computerized games, chess and others, enables us to capture some of the elements Ericsson suggests more precisely and thereby better understand expertise within complex systems. In addition, since 97% of teens and 49% of adults play video games (Duggan, 2015; Lenhart, Kahne, Middaugh, Macgill, et al., 2008), research in this area is socially and developmentally relevant (Greenfield, DeWinstanley, Kilpatrick, & Kaye, 1994). Video games provide a fertile environment in which to study the development of expertise in a domain that carries significant cultural capital. Greenfield et al. (1994) have argued that video games are an excellent platform for studying skill acquisition because they involve goal-oriented behavior as well as instantaneous feedback.
