Empirical Investigation of Participation on Crowdsourcing Platforms: A Gamified Approach

Abhishek Behl, Pratima Sheorey, Meena Chavan, Kokil Jain, Isha Jajodia
Copyright: © 2021 | Pages: 27
DOI: 10.4018/JGIM.20211101.oa14


Crowdsourcing platforms have gained importance in recent times, and their success depends largely on the participation of the crowd. Participation is a function of both intrinsic and extrinsic motivation. Moreover, as the scale of information grows, participants need to focus on the quality of information to achieve sustainable participation. Our study uses game elements and information quality, grounded in the Motivational Affordance Perspective (MAP), to study intrinsic and extrinsic participation on a crowdsourcing platform. We collected responses from 337 participants who actively contribute to a crowdsourcing platform and analyzed them using partial least squares structural equation modeling (PLS-SEM) in WarpPLS. The results confirm that the use of game elements positively promotes participants' intrinsic and extrinsic participation. We also confirmed that motivation is positively moderated by the quality of information that the crowdsourcing platform shares with participants. The results help extend the theoretical arguments of MAP and self-determination theory.

1. Introduction

The consistently growing online population has generated volumes of knowledge and transformed the scope of "big data analytics" (BDA). Companies worldwide invariably seek both quality and quantity of data to better understand the behavior of existing and potential customers. Studies assert that the role of crowdsourcing platforms includes understanding mass sentiment (Xintong et al., 2014). Crowdsourcing platforms are interactive portals that enable like-minded people to contribute knowledge, thoughts, views, and ideas towards a common goal. This, in turn, helps in tapping the wisdom of the crowd (Ye and Kankanhalli, 2017; Goh et al., 2017; Deng et al., 2016) across disciplines like healthcare, entertainment, societal development, elections, and digital marketing. Geiger and Schader (2014) discussed four archetypes of crowdsourcing platforms (crowd rating, crowd creating, crowd solving, and crowd processing) based on a 2x2 matrix defined by the differentiation of contributions and the manner in which value is derived from them. Of the four types, crowd creation is often used to develop or derive comprehensive artifacts from various heterogeneous contributions of the crowd. Common examples include user-generated content on channels like YouTube, videos posted on Facebook, and knowledge drawn from platforms like Wikipedia (Morschheuser et al., 2017).

Businesses use task-based crowdsourcing activities to engage the crowd, aiming to derive solutions or ideas for a defined problem (Deng et al., 2016). Howe (2008) defines organizational task crowdsourcing as the act of hiring, recruiting, or engaging large groups of undefined, random individuals, often addressed as solvers. These individuals undertake an organizational task through an internet-based platform maintained either by the firm or by a third party. The objective of such tasks is essentially linked to the profit of the firm. Moreover, organizations employ crowdsourcing to gain insights from a random crowd about a product that has either been launched or is about to be launched in a market. Examples include contests like Doritos' "Crash the Super Bowl," Starbucks' "White Cup Contest," Lay's "Do Us a Flavour," and "Airbnb Shorts." Firms use platforms like Mechanical Turk, TopCoder, and ZbJ for crowdsourcing tasks, gaining popularity, attracting advertisements, and earning commissions by engaging the crowd.

A common concern across these platforms is the lack of the crowd's continued engagement (Hossain, 2012; Kaufmann et al., 2011). Reasons include low incentives for the crowd; irrelevant and irregular nudges to participate in activities; overly complex crowdsourcing platforms; and vernacular and language barriers (Morschheuser and Hamari, 2019; Zhu et al., 2014; Durward et al., 2016). The crowd and its continued participation determine the success of any crowdsourcing platform. Studies have claimed that crowd engagement through interactive crowdsourcing platforms has helped firms obtain better and more efficient sources of information (Morschheuser and Hamari, 2019; Lee et al., 2013). Sustained and meaningful crowd engagement is ensured through appropriate portal design and the application of game mechanics, which in turn helps in better understanding crowd behavior. Lately, crowdsourcing platforms have also started using game elements to capture and prolong crowd attention. Tomnod's digital initiative employs individual core drives of the Octalysis gamification framework, such as Core Drive 1 (Epic Meaning and Calling) and Core Drive 5 (Social Influence and Relatedness), to promote innovation and creativity. Other examples of gamified crowdsourcing include Google Image Labeler and Foldit; Google Image Labeler is one of the earliest examples to integrate game elements to encourage participation. Different crowdsourcing applications use a range of gamification principles depending upon their objectives and the quality of experience they seek to provide. Enhancing the user experience by making it more immersive through gamification can benefit the crowdsourcing endeavour.
