Quality and Acceptance of Crowdsourced Translation of Web Content

Ajax Persaud, Steven O'Brien
Copyright: © 2017 | Pages: 16
DOI: 10.4018/IJTHI.2017010106

Abstract

Organizations make extensive use of websites to communicate with people. Often, visitors to their sites speak many different languages and expect to be served in their native language. Translation of web content is a major challenge for many organizations because of high costs and frequent changes in the content. Currently, organizations rely on professional translators or machines to translate their content. The challenge is that professional translation is costly and too slow, while machine translation does not produce high-quality or accurate translations, even though it may be faster and less expensive. Crowdsourcing has emerged as a technique with many applications. The purpose of this research is to test whether crowdsourcing can produce translations of equivalent or better quality than professional or machine translators. A crowdsourcing study was undertaken, and the results indicate that the quality of crowdsourced translations was equivalent to professional translations and far better than machine translations. The research and managerial implications are discussed.

Introduction

Crowdsourcing takes a job normally performed by a designated person and has it done by a large, undefined, and dispersed number of participants (Howe, 2008). Participants are recruited online to perform defined, sub-autonomous tasks (Behrend, Sharek, Meade, & Wiebe, 2011). Crowdsourcing leverages Web 2.0 tools (O'Reilly, 2007) to foster intellectual cooperation among human collectives in order to create, innovate, and invent (Grasso & Convertino, 2012).

Crowdsourcing has been used in a variety of contexts, including idea generation (Schweitzer, Buchinger, Gassmann, & Obrist, 2012), citizen journalism (Tilley & Cokley, 2008), disaster and emergency management (Dailey & Starbird, 2014), scientific collaboration (Jirotka, Lee, & Olson, 2013), health care management (Adams, 2014), and linguistics (Munro et al., 2010). The study by Eagle (2009) is an early attempt to use crowdsourcing to translate single words and terminologies that have specific meanings among particular groups (e.g., engineers, accountants). Ledlie, Odero, Minkov, Kiss, and Polifroni (2010) used crowdsourcing for speech recognition translation, i.e., converting speech to text. Using crowdsourcing to translate website content that is longer, more descriptive, open to many interpretations, and targeted to the general public rather than specific groups is a qualitatively more complex and difficult undertaking that has not yet been done but is needed (Snow, O'Connor, Jurafsky, & Ng, 2008).

Translation of website content is important for several reasons. Today, users turn to the web as their first source of information on virtually any topic, product, or service. Therefore, it is important for organizations, businesses, and governments to make their content accessible to users in their native languages. Hutchins (2001) contends that people prefer to read content in their first language, even with some errors, rather than struggle to understand a website in a foreign language. DePalma, Sargent, and Beninatto (2006) found that language quality is a major factor influencing consumers' online purchase decisions and interactions with organizations. Consequently, organizations serving people who speak different languages should offer web content in multiple languages.

There is also an important social challenge pertaining to the digital divide associated with language barriers. Although the term digital divide is often used to describe whether or not people have access to the Internet or information and communication technologies (ICTs), Selwyn (2004) posits that it can also be seen as a practical embodiment of the wider theme of social inclusion or exclusion. Selwyn (2004) argues that access to the Internet does not guarantee that users can effectively access every available website and online resource. Similarly, Van Dijk and Hacker (2003) argue that access to a technology is useless without the necessary skills, knowledge, and support to use it effectively. According to Selwyn (2004), 'meaningful' use of the web requires that users are able to engage with the content. Moreover, content is only useful if it is relevant to the user, and those who are isolated or marginalized face a digital divide (Selwyn, 2004). Thus, not serving users in their first language is a form of digital divide that needs bridging.

Translating website content is a formidable challenge because websites contain enormous amounts of content that changes frequently. Translating web content using professional translators is slow and costly. On the other hand, machine translation, while faster and less expensive, produces inferior-quality translations (Bowker, 2008). These challenges, combined with the growing use of crowdsourcing, motivate the question posed in this study: Can crowdsourcing produce equivalent or higher quality translations compared to those produced by professional or machine translators?
