Introduction
Live streaming is a mixed media form (Hamilton, Garretson, & Kerne, 2014) that differs from traditional social media in that it is a synchronous medium with unique attributes such as simultaneity (Scheibe, Fietkiewicz, & Stock, 2016) and authenticity (Tang, Venolia, & Inkpen, 2016), allowing users (broadcasters and viewers) to interact with each other in real time through live video and chat (Wohn, Freeman, & McLaughlin, 2018). Twitch, one of the leading live streaming video services, originally focused on games but is increasingly extending to creative content and mobile broadcasting. As of September 2018, Twitch had numerous content categories, including IRL (in real life), Creative, Food & Drink, and Travel & Outdoors (Roger, 2018). Streamers are the content creators and broadcasters of gameplay or other categories; viewers watch the streaming video and send messages to the streamer or other viewers in a chat interface adjacent to the video.
The popularity of live streams and the success of Twitch have made them a growing subject of academic attention. Most current research on live streaming, however, focuses on streamers and viewers, examining streamer or viewer motives (Cai & Wohn, 2019; Cai, Wohn, Mittal, & Sureshbabu, 2018; Friedländer, 2017; Scheibe et al., 2016) and streamer-viewer interactions (Lu, Xia, Heo, & Wigdor, 2018; Wohn et al., 2018), with less but growing attention to the prominent yet hidden role of human moderators (Seering, Wang, Yoon, & Kaufman, 2019; Wohn, 2019).
Prior research defines content moderation as “the organized practice of screening user-generated content posted to internet sites, social media, and other online outlets, in order to determine the appropriateness of the content for a given site, locality, or jurisdiction” (Roberts, 2017). Generally, moderators perceive their roles as “filter, firefighter, discussion leader, and content expert” and they moderate content to guide the discussion and to keep down “flames” (Berge & Collins, 2000). Commercial content moderators, who are paid workers, curate content and guard against violations such as racism, homophobic slurs, pornography, and violence (Roberts, 2016). These commercial content moderators usually review inappropriate content that has been flagged by users or detection algorithms. Twitch has commercial content moderators but also enables streamers to appoint their own moderators (also known as “mods”). Mods voluntarily assist the streamer in managing the chat content and are usually unpaid (Wohn, 2019).
Due to the synchronicity of live streams, messages flow through the chatroom in real time, posing challenges different from those in asynchronous communities such as Wikipedia and Reddit, which also rely largely on volunteer moderators. Technical interventions can, to some extent, reduce the human moderation load, especially in large and fast-moving chats (AnyKey, 2016). Many online communities, such as Reddit and Twitch, deploy bots (software robots) to assist mods in their moderation practice (Seering, Wang et al., 2019). Current research on using bots for content moderation mainly focuses on asynchronous communities such as Reddit (Gilbert, 2013; Long et al., 2017) and Wikipedia (Clément & Guitton, 2015; Müller-Birn, Dobusch, & Herbsleb, 2013), with limited research on bots for moderation on Twitch (Seering, Luria, Kaufman, & Hammer, 2019). A better understanding of the moderation tools that mods use every day would help improve current tool design, reduce mods' workload, and further benefit the community. The goal of this research is to classify the features of Twitch moderation tools into categories that could generalize to other moderation tools and to provide implications for future tool design.