(In)visible moderation: A digital ethnography of marginalized users and content moderation on Twitch and Reddit

Detailed bibliography
Published in: New media & society, Volume 26, Issue 7, pp. 4034-4055
Main authors: Thach, Hibby; Mayworm, Samuel; Delmonaco, Daniel; Haimson, Oliver
Medium: Journal Article
Language: English
Publication details: London, England: SAGE Publications, 01.07.2024
ISSN: 1461-4448, 1461-7315
Description
Summary: Research suggests that marginalized social media users face disproportionate content moderation and removal. However, when content is removed or accounts suspended, the processes governing content moderation are largely invisible, making it difficult to assess content moderation bias. To study this bias, we conducted a digital ethnography of marginalized users on Reddit’s /r/FTM subreddit and Twitch’s “Just Chatting” and “Pools, Hot Tubs, and Beaches” categories, observing content moderation visibility in real time. We found that on Reddit, a text-based platform, platform tools make content moderation practices invisible to users, but moderators make their practices visible through communication with users. Yet on Twitch, a live chat and streaming platform, content moderation practices are visible in channel live chats, “unban appeal” streams, and “back from my ban” streams. Our ethnography shows how content moderation visibility differs in important ways between social media platforms, at times harming those who must see offensive content and at other times allowing for increased platform accountability.
DOI: 10.1177/14614448221109804