Conducting Survey Research Using MTurk

Silvana Chambers, Kim Nimon
DOI: 10.4018/978-1-5225-5164-5.ch016

Abstract

This chapter presents an introduction to crowdsourcing for survey participant recruitment. It also discusses best practices and ethical considerations for conducting survey research using Amazon Mechanical Turk (MTurk). Readers will learn the benefits, limitations, and trade-offs of using MTurk as compared to other recruitment services, including SurveyMonkey and Qualtrics. A synthesis of survey design guidelines, along with a sample survey, is presented to help researchers collect the best quality data. Techniques, including SPSS and R syntax, are provided that demonstrate how users can clean resulting data and identify valid responses for which workers could be paid. An overview and accompanying syntax for conducting longitudinal studies are provided as well.
Chapter Preview

Conducting Survey Research Using MTurk

Over the past 25 years, the Internet has progressively become part of how we live. It has changed the way we communicate and exchange knowledge. Consequently, the Internet has also become a tool for conducting academic and organizational research (Callegaro, Baker, Bethlehem, Goritz, Krosnick, & Lavrakas, 2014; Granello & Wheaton, 2004; Oppenheimer, Pannucci, Kasten, & Haase, 2011). In the field of human resource development (HRD), for example, it is not uncommon for findings from survey research to inform theory and/or practice (cf. Gubbins & Rousseau, 2015; Shuck & Reio, 2011).

A growing trend in the psychological and behavioral sciences is the use of crowdsourcing to recruit survey participants (Palmer & Strickland, 2016). Crowdsourcing refers to the process of obtaining content or services by soliciting contributions from a large and diverse pool of people, particularly online communities. The use of crowdsourcing for academic research has increased over the past decade (Harms & DeSimone, 2015). In fact, many reputable journals have published articles for which crowdsourcing methods were utilized (Palmer & Strickland, 2016). Today, multiple crowdsourcing platforms are available that connect researchers with populations of potential participants (e.g., MTurk, SurveyMonkey Audience, Qualtrics Panels, StudyResponse, and CrowdFlower).

Despite its growing acceptance and its capacity to reach large numbers of people, a recent study suggests that researchers hold concerns, often unsupported, about crowdsourcing platforms and are therefore hesitant to adopt crowdsourcing as a methodological tool for academic research (Law, Gajos, Wiggins, Gray, & Williams, 2017). One of the most popular commercial crowdsourcing services used by social science researchers to recruit participants is MTurk (Buhrmester, Kwang, & Gosling, 2011). Recent studies have found that data collected through MTurk were as good as or better than data collected by more traditional survey methods (Behrend, Sharek, Meade, & Wiebe, 2011; Feitosa, Joseph, & Newman, 2015).

In this chapter, we offer a primer to researchers interested in using MTurk for survey research. First, we present an introduction to crowdsourcing for recruiting survey participants. Second, we present MTurk as a tool for conducting survey research, discuss ethical considerations, and compare it to other crowdsourcing services used for survey participant recruitment (i.e., SurveyMonkey, Qualtrics). Third, we discuss the limitations and benefits associated with MTurk. Fourth, we synthesize best practices for designing a survey to be deployed on MTurk and present an associated example. Fifth, we review the implications of collecting data from MTurk and provide R (see Appendix A) and SPSS syntax (see Appendix B) that may be helpful starting places for researchers who are new to collecting data from MTurk.
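
To give a sense of the kind of data cleaning the appendices address, the sketch below is a minimal R example (not the chapter's actual appendix syntax) that flags valid responses and extracts the worker IDs to be compensated. The file name and the column names WorkerId, attention_check, and duration_sec are illustrative assumptions about how a survey export linked to MTurk might be structured.

# Minimal illustrative sketch; assumes a survey export with columns
# WorkerId, attention_check, and duration_sec (hypothetical names).
responses <- read.csv("survey_export.csv", stringsAsFactors = FALSE)

# Retain responses that pass the attention check and meet a minimum
# completion time, then drop duplicate submissions from the same worker.
valid <- subset(responses, attention_check == "agree" & duration_sec >= 120)
valid <- valid[!duplicated(valid$WorkerId), ]

# Save the WorkerIds of valid responses so the requester can approve
# (i.e., pay) those workers in MTurk.
write.csv(data.frame(WorkerId = valid$WorkerId), "workers_to_pay.csv", row.names = FALSE)

In practice, the specific screening criteria (attention-check wording, minimum completion time) should follow the survey design guidelines and payment considerations discussed later in the chapter.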
