Videogame Performance (Not Always) Requires Intelligence

M. Ángeles Quiroga, Francisco J. Román, Ana Catalán, Herman Rodríguez, Javier Ruiz, María Herranz, Marta Gómez-Abad, Roberto Colom
Copyright: © 2011 | Pages: 15
DOI: 10.4018/ijopcd.2011070102

Abstract

This study was designed to test whether videogame performance requires intelligence even when practice periods are much longer than previously reported (Quiroga, Herranz, Gómez-Abad, Kebir, Ruiz, & Colom, 2009a). The study involved 27 female university undergraduates. Intelligence was measured with several tests both before and after videogame practice. Participants played videogames one day per week for five weeks, completing five blocks of trials each day, for a total of twenty-five blocks (250 trials). The main finding is that performance on some videogames is systematically related to intelligence throughout the practice period, indicating that the basic abilities underlying these videogames cannot easily be automated. For other videogames, however, the relationship to intelligence is greatly reduced over the practice period. Drawing on these findings, ways to challenge intelligence using videogames are proposed.

Introduction

Videogame Practice Effects

Nowadays, playing video games is a popular leisure activity for more than 15% of young people in Europe aged 15 to 30 (Eurostat, 2009). These data come from the National Time Use Surveys carried out in 2007. These surveys draw a representative sample of individuals (controlling for age and gender) from each country, who completed a diary for one weekday and one weekend day. The percentage of individuals spending free time playing video games is higher among those aged 15 to 19 (20%) than among those aged 30 to 49 (14%). Norway shows the highest percentages: 26% of individuals aged 15 to 19 and 20% of individuals aged 30 to 49. In the USA, individuals aged 15 to 19 spend more than an hour per day, while individuals aged 35 to 54 spend the least time (25 minutes per day on average). These data come from the Bureau of Labor Statistics and refer to the civilian population in 2008 (Bureau of Labor Statistics, 2009). The numbers are similar for men and women on weekdays (33 and 28 minutes, respectively) but not on weekend days (55 minutes for men and 31 minutes for women). Finally, individuals with a college degree play more on average than those with less than a high school diploma (32 and 19 minutes, respectively). See also Cummings and Vandewater (2007) and Hartmann and Klimmt (2006).

The results summarized above are relevant for at least two reasons. First, designers of next-generation computers are trying to implement interactive devices in place of desktops and fixed-size screens (Mistry, Maes, & Chang, 2009), which means that playing on an interactive console and following an online course will require the same skills from participants. Second, it has been claimed that playing computer games helps improve cognitive functions (Cherney, 2008; Kearney, 2005; Rosas et al., 2003; Sims & Mayer, 2002; Terlecki, Newcombe, & Little, 2008).

Claims about improvement of cognitive functions should be considered carefully, as the conditions vary considerably across these studies. Sources of variability include (a) the criterion variable used to test game effects and its relation to the game, and (b) the amount of practice time and participants' previous experience.

Regarding the relationship between games and criterion measures, it is necessary to distinguish between positive transfer effects, domain-specific improvement, and process-based improvement. Positive transfer effects take place when player performance benefits from previous learning due to similarity between the videogame and the criterion measure. Domain-specific improvement occurs when players improve their performance in the learning task through practice or training with the game because the game and the learning task share the same content. Process-based improvement happens when players improve their performance in a task whose content is unrelated to the game. Wright, Thompson, Ganis, Newcombe, and Kosslyn (2008), working on training spatial skills, highlighted these differences.

Studies of videogames and cognitive abilities can be grouped according to the types of effects described above. Rosas et al. (2003) used video games to teach math and reading comprehension; afterwards, students were tested on reading comprehension, math, and spelling with usual classroom learning tasks. Rosas et al.'s study is thus an example of the transfer-effects approach. Sims and Mayer (2002) used "Tetris" as practice and tested spatial ability. Kearney (2005) used the game "Counter Strike" in its multiplayer network mode and tested multitasking abilities after practice. These two studies are examples of the domain-specific improvement approach.

However, when the focus is on individual differences in absolute performance, correlations between abilities and performance are the appropriate analysis (Voelkle, Wittmann, & Ackerman, 2006). Growth curve analysis would be more appropriate if the focus of the study were on individual differences in change through practice.
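The per-block correlational analysis just described can be sketched in Python. The data below are simulated for illustration only (the sample size and block count follow the study's design, but the scores, effect sizes, and variable names are assumptions, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 27 participants, one intelligence score each,
# and game performance measured over 25 practice blocks.
n_participants, n_blocks = 27, 25
intelligence = rng.normal(100, 15, n_participants)

# Simulate game scores that improve with practice and partly
# track intelligence (illustrative assumption).
practice_gain = np.linspace(0, 10, n_blocks)
game_scores = (0.3 * intelligence[:, None]
               + practice_gain[None, :]
               + rng.normal(0, 5, (n_participants, n_blocks)))

# Pearson correlation between intelligence and performance at each
# block. A correlation that stays high (or rises) across blocks is the
# pattern expected for process-based improvement; a declining
# correlation suggests the game's demands have become automated.
per_block_r = np.array([
    np.corrcoef(intelligence, game_scores[:, b])[0, 1]
    for b in range(n_blocks)
])
print(per_block_r.round(2))
```

Plotting `per_block_r` against block number gives the correlation trajectory that the patterns discussed below are interpreted from.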

Therefore, regarding individual differences in absolute performance, correlations through practice would be expected to show: (a) positive transfer effects, if performance on the game improved but the correlation between game performance and general cognitive ability decreased; (b) domain-specific improvement, if game performance improved and the correlation between performance and the cognitive ability content-related to the practiced task remained at a similar value from the beginning to the end of the practice period; and (c) process-based improvement, if performance improved and the correlation between performance and general cognitive abilities remained at similar values from the beginning to the end of the practice period. Quiroga et al. (2009a) studied several games included in "Big Brain Academy®" for the Wii console, correlating performance through practice with intelligence. The obtained correlations ranged from 0.49 at the beginning of the practice period to 0.67 at the end (10 blocks of 10 trials each). Quiroga et al. (2009b) analyzed the game "Professor Layton and the Curious Village®" (for Nintendo DS), correlating performance through practice with intelligence. The obtained correlations ranged from 0.37 at the beginning of the practice period to 0.69 at the end (15 hours of playing). Both studies were interpreted as showing that moderate complexity, low consistency, and novelty for all players could be the main features underlying videogames that require intelligence. These two studies illustrate the data expected when improvements are process-based.

With respect to the amount of practice, researchers usually refer to intensive training or practice, but studies differ in how intensive practice is measured. Rosas et al. (2003) employed 30 hours of practice over 3 months. Sims and Mayer (2002) used 12 hours of "Tetris" playing experience. Wright et al. (2008) employed daily practice sessions (15 to 20 minutes) over 21 consecutive days, and Quiroga et al. (2009b) used 16 hours of "Professor Layton and the Curious Village" playing experience. Therefore, when studying videogames, intensive practice can be defined in terms of: (a) the time (hours) spent playing to complete the game, when the videogame is a story or adventure the player must finish (as in "Counter Strike" or "Professor Layton and the Curious Village"); or (b) the time spent playing (repeating) the game, or the number of trials played (as in "Tetris" or "Big Brain Academy"). The present study follows the second option.
