(In)visible moderation: A digital ethnography of marginalized users and content moderation on Twitch and Reddit

Bibliographic Details
Published in: New Media & Society, Vol. 26, No. 7, pp. 4034-4055
Main Authors: Thach, Hibby, Mayworm, Samuel, Delmonaco, Daniel, Haimson, Oliver
Format: Journal Article
Language: English
Published: London, England: SAGE Publications, 01.07.2024
ISSN: 1461-4448, 1461-7315
Description
Summary: Research suggests that marginalized social media users face disproportionate content moderation and removal. However, when content is removed or accounts suspended, the processes governing content moderation are largely invisible, making assessing content moderation bias difficult. To study this bias, we conducted a digital ethnography of marginalized users on Reddit’s /r/FTM subreddit and Twitch’s “Just Chatting” and “Pools, Hot Tubs, and Beaches” categories, observing content moderation visibility in real time. We found that on Reddit, a text-based platform, platform tools make content moderation practices invisible to users, but moderators make their practices visible through communication with users. Yet on Twitch, a live chat and streaming platform, content moderation practices are visible in channel live chats, “unban appeal” streams, and “back from my ban” streams. Our ethnography shows how content moderation visibility differs in important ways between social media platforms, at times harming those who must see offensive content, and at other times allowing for increased platform accountability.
DOI: 10.1177/14614448221109804