Symmetric Therapeutic Frameworks and Ethical Dimensions in AI-Based Mental Health Chatbots (2020–2025): A Systematic Review of Design Patterns, Cultural Balance, and Structural Symmetry.

Saved in:
Detailed bibliography
Title: Symmetric Therapeutic Frameworks and Ethical Dimensions in AI-Based Mental Health Chatbots (2020–2025): A Systematic Review of Design Patterns, Cultural Balance, and Structural Symmetry.
Authors: Algumaei, Ali; Yaacob, Noorayisahbe Mohd; Doheir, Mohamed; Al-Andoli, Mohammed Nasser; Algumaie, Mohammed
Source: Symmetry (2073-8994); Jul 2025, Vol. 17, Issue 7, p1082, 30 pp.
Subjects: MENTAL health services, GENERATIVE artificial intelligence, NATURAL language processing, COGNITIVE therapy, RESOURCE-limited settings, CHATBOTS
Abstract: Artificial intelligence (AI)-powered mental health chatbots have evolved rapidly as scalable means of psychological support, offering novel solutions through natural language processing (NLP), mobile accessibility, and generative AI. This systematic literature review (SLR), following PRISMA 2020 guidelines, collates evidence from 25 peer-reviewed studies published between 2020 and 2025 and reviews therapeutic techniques, cultural adaptation, technical design, system assessment, and ethics. Studies were extracted from seven academic databases, screened against specific inclusion criteria, and thematically analyzed. Cognitive behavioral therapy (CBT) was the most common therapeutic model, featured in 15 systems and frequently used jointly with journaling, mindfulness, and behavioral activation; emotion-based approaches followed, featured in seven systems. Innovative techniques such as GPT-based emotional processing, multimodal interaction (e.g., AR/VR), and LSTM-SVM classification models (greater than 94% accuracy) showed increased conversational flexibility but lacked long-term clinical validation. Cultural adaptability varied: effective localization was seen in systems such as XiaoE, okBot, and Luda Lee, while Western-oriented systems showed limited contextual adaptability. Accessibility and inclusivity remain major challenges, especially in low-resource settings, owing to gaps in digital literacy, multilingual support, and infrastructure. Ethical aspects—data privacy, explainability, and crisis plans—were under-evidenced in most deployments. This review differs from previous ones in its focus on cultural adaptability, ethics, and hybrid public health integration, and it proposes a comprehensive approach for deploying AI mental health chatbots safely, effectively, and inclusively.
Central to this review, symmetry is emphasized as a fundamental idea incorporated into frameworks for cultural adaptation, decision-making processes, and therapeutic structures. In particular, symmetry ensures equal cultural responsiveness, balanced user–chatbot interactions, and ethically aligned AI systems, all of which enhance the efficacy and dependability of mental health services. Recognizing these benefits, the review further underscores the necessity for more rigorous academic research into the development, deployment, and evaluation of mental health chatbots and apps, particularly to address cultural sensitivity, ethical accountability, and long-term clinical outcomes. [ABSTRACT FROM AUTHOR]
Copyright of Symmetry (20738994) is the property of MDPI and its content may not be copied or emailed to multiple sites without the copyright holder's express written permission. Additionally, content may not be used with any artificial intelligence tools or machine learning technologies. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
Database: Complementary Index
ISSN: 2073-8994
DOI: 10.3390/sym17071082