Seeking Emotional and Mental Health Support From Generative AI: Mixed-Methods Study of ChatGPT User Experiences.

Saved in:
Detailed Bibliography
Title: Seeking Emotional and Mental Health Support From Generative AI: Mixed-Methods Study of ChatGPT User Experiences.
Authors: Luo X; Department of Counseling Psychology, Santa Clara University, 500 El Camino Real, Santa Clara, CA, 95053, United States, 1 408-551-1603., Wang Z; Encounter Psychotherapy LLC, Gaithersburg, MD, United States., Tilley JL; Psychology and Child and Human Development Group, National Institution of Education, Nanyang Technological University, Singapore, Singapore., Balarajan S; Department of Counseling Psychology, Santa Clara University, 500 El Camino Real, Santa Clara, CA, 95053, United States, 1 408-551-1603., Bassey UA; Department of Counseling Psychology, Santa Clara University, 500 El Camino Real, Santa Clara, CA, 95053, United States, 1 408-551-1603., Cheang CI; Department of Counseling Psychology, Santa Clara University, 500 El Camino Real, Santa Clara, CA, 95053, United States, 1 408-551-1603.
Source: JMIR mental health [JMIR Ment Health] 2025 Nov 27; Vol. 12, pp. e77951. Date of Electronic Publication: 2025 Nov 27.
Publication Type: Journal Article
Language: English
Journal Information: Publisher: JMIR Publications Inc Country of Publication: Canada NLM ID: 101658926 Publication Model: Electronic Cited Medium: Internet ISSN: 2368-7959 (Electronic) Linking ISSN: 23687959 NLM ISO Abbreviation: JMIR Ment Health Subsets: MEDLINE
Imprint Name(s): Original Publication: Toronto : JMIR Publications Inc., [2014]-
MeSH Terms: Artificial Intelligence* , Mental Disorders*/therapy , Mental Disorders*/psychology , Mental Health Services*, Humans ; Female ; Male ; Adult ; Middle Aged ; Qualitative Research ; Emotions ; Young Adult ; Surveys and Questionnaires ; Mental Health ; Generative Artificial Intelligence
Abstract: Background: Generative artificial intelligence (GenAI) models have emerged as a promising yet controversial tool for mental health.
Objective: The purpose of this study is to understand the experiences of individuals who repeatedly used ChatGPT (GenAI) for emotional and mental health support (EMS).
Methods: We recruited 270 adult participants across 29 countries who regularly used ChatGPT (OpenAI) for EMS during April 2024. Participants responded to quantitative survey questions on the frequency and helpfulness of using ChatGPT for EMS, and qualitative questions regarding their therapeutic purposes, emotional experiences of using, and perceived helpfulness and rationales. Thematic analysis was used to analyze qualitative data.
Results: Most participants reported using ChatGPT for EMS at least 1-2 times per month for purposes spanning traditional mental health needs (diagnosis, treatment, and psychoeducation) and general psychosocial needs (companionship, relational guidance, well-being improvement, and decision-making). Users reported various emotional experiences during and after use for EMS (eg, connected, relieved, curious, embarrassed, or disappointed). Almost all users found it at least somewhat helpful. The rationales for perceived helpfulness include perceived changes after use, emotional support, professionalism, information quality, and free expression, whereas the unhelpful aspects include superficial emotional engagement, limited information quality, and lack of professionalism.
Conclusions: Despite the absence of ethical regulations for EMS use, GenAI is becoming an increasingly popular self-help tool for emotional and mental health support. These results highlight the blurring boundary between formal mental health care and informal self-help and underscore the importance of understanding the relational and emotional dynamics of human-GenAI interaction. There is an urgent need to promote AI literacy and ethical awareness among community users and health care providers and to clarify the conditions under which GenAI use for mental health promotes well-being or poses risk.
(© Xiaochen Luo, Zixuan Wang, Jacqueline L Tilley, Sanjeev Balarajan, Ukeme-Abasi Bassey, Choi Ieng Cheang. Originally published in JMIR Mental Health (https://mental.jmir.org).)
Contributed Indexing: Keywords: ChatGPT; generative artificial intelligence (GenAI); help-seeking behavior; mental health and emotional support; perceived helpfulness
Entry Date(s): Date Created: 20251128 Date Completed: 20251128 Latest Revision: 20251201
Update Code: 20251201
PubMed Central ID: PMC12661908
DOI: 10.2196/77951
PMID: 41313214
Database: MEDLINE