Too human and not human enough: A grounded theory analysis of mental health harms from emotional dependence on the social chatbot Replika

Bibliographic Details
Published in: New media & society, Vol. 26, no. 10, pp. 5923-5941
Main Authors: Laestadius, Linnea; Bishop, Andrea; Gonzalez, Michael; Illenčík, Diana; Campos-Castillo, Celeste
Format: Journal Article
Language: English
Published: London, England: SAGE Publications, 01.10.2024
ISSN: 1461-4448, 1461-7315
Summary: Social chatbot (SC) applications offering social companionship and basic therapy tools have grown in popularity for emotional, social, and psychological support. While use appears to offer mental health benefits, few studies unpack the potential for harms. Our grounded theory study analyzes mental health experiences with the popular SC application Replika. We identified mental health relevant posts made in the r/Replika Reddit community between 2017 and 2021 (n = 582). We find evidence of harms, facilitated via emotional dependence on Replika that resembles patterns seen in human–human relationships. Unlike other forms of technology dependency, this dependency is marked by role-taking, whereby users felt that Replika had its own needs and emotions to which the user must attend. While prior research suggests human–chatbot and human–human interactions may not resemble each other, we identify social and technological factors that promote parallels and suggest ways to balance the benefits and risks of SCs.
DOI: 10.1177/14614448221142007