Vol. 7 No. 1 (2025): January - March 2025
Clinical psychology

Editorial: Mental Health 4.0: the contribution of LLM models in mental health care processes.

Valeria Cioffi
SiPGI - Postgraduate School of Integrated Gestalt Psychotherapy, Torre Annunziata, Napoli, Italia

Published 2025-03-14

Keywords

  • Artificial Intelligence in Psychotherapy
  • Large Language Models (LLM)
  • Therapist-Patient Relationship

How to Cite

Cioffi, V. (2025). Editorial: Mental Health 4.0: the contribution of LLM models in mental health care processes. Phenomena Journal - International Journal of Psychopathology, Neuroscience and Psychotherapy, 7(1), 38–40. https://doi.org/10.32069/PJ.2021.2.242

Abstract

"There are frontiers where we are learning, and our desire for knowledge burns. They lie in the deepest fabric of space, in the origins of the cosmos, in the nature of time, in the fate of black holes, and in the workings of our own thought. Here, at the edge of what we know, in contact with the ocean of what we do not know, shines the mystery of the world, the beauty of the world, leaving us breathless." Rovelli, C. (2014). Sette brevi lezioni di fisica. Milano: Adelphi.​

 

In recent years, the emergence of Large Language Models (LLMs) has demonstrated remarkable potential in the analysis of human language. In the field of psychological sciences, these tools have enabled the development of emerging research domains, including computational psychology, computational psychopathology, and, more recently, computational psychotherapy [1], facilitating interdisciplinary research progress in these areas.

The therapist/patient dyad, like the supervisee/supervisor dyad, constitutes a complex system whose multidimensional functioning requires nonlinear statistical methods for deep and accurate analysis.

The study of psychotherapeutic process paradigms therefore benefits from an interdisciplinary approach that integrates knowledge from clinical psychology with the most advanced artificial intelligence (AI) technologies, opening new avenues of study and application.

Whether and how AI can support functions traditionally performed by humans, without replacing the profound process of understanding, which remains a subjective and intersubjective experience, is a highly complex and multifaceted question.

The integration of AI into psychotherapy remains an open challenge, especially in light of the rapid advancements in LLM efficiency, which, in the past two years, have drawn enormous attention and ignited debate about their applications in medicine and public health [2].

Like all Copernican revolutions throughout human history, this one evokes a strong sense of uncertainty, raising both questions and deep-seated fears. The question that spontaneously arises today is not only whether AI can serve as a valid and functional auxiliary tool for mental health professionals but, more importantly, whether we can truly imagine a future in which a chatbot completely replaces a human professional in this field.

ChatGPT (Chat Generative Pre-trained Transformer) [3] is software that simulates and processes human conversations (written or spoken) using AI and machine learning; it was developed by OpenAI (an artificial intelligence research laboratory) and specializes in interacting with human users. Version 3.5, released in 2022, has enormous potential: it is easy to use and can engage in conversations that are difficult to distinguish from human communication. Furthermore, because ChatGPT was developed using supervised learning and reinforcement learning from human feedback, the quality of chatbot-generated content can be significantly improved through further training and optimization [4].
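
To make this concrete, the following minimal sketch (ours, not drawn from the cited sources) shows how a GPT-style model of this generation is typically steered at inference time: a "system" message fixes the assistant's role before user input is processed. It assumes the OpenAI Python SDK (version 1.x) and an API key in the environment; the prompt wording and the use case are illustrative assumptions, not a clinical deployment.

    # Minimal illustrative sketch: steering a GPT-style chatbot with a system
    # prompt via the OpenAI Python SDK (>= 1.0). Assumes OPENAI_API_KEY is set.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # the 3.5 generation discussed above
        messages=[
            # The system prompt constrains role, tone, and scope.
            {"role": "system",
             "content": ("You are a supportive listening assistant. "
                         "You never diagnose, and you encourage the user "
                         "to consult a licensed mental health professional.")},
            {"role": "user", "content": "I've been feeling anxious lately."},
        ],
    )
    print(response.choices[0].message.content)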

The debate on the potential uses of ChatGPT in psychotherapy is still in its early stages; significant concerns and ethical discussions are therefore emerging within the mental health community and related scientific fields. Some mental health professionals have described various chatbot applications, including assisting psychiatrists with routine tasks such as filling out medical records, facilitating doctor-patient communication, and improving written and academic communication [5]. Recently, interesting developments have emerged, with some startups launching GPT-based chatbots capable of providing basic psychological support. The question remains open, however, as to whether these tools can truly offer effective and comprehensive support to patients without reducing the complexity inherent in the therapeutic relationship to a mere algorithmic interaction.

Numerous future advancements are conceivable for chatbots assisting in mental health, including emotion recognition, personality assessment, the identification of risk factors, and the detection of early signs of psychological distress [6]. Several authors [7] have hypothesized that, in the near future, it will be possible to develop a fully automated psychotherapy system based on GPT technology, trained to provide expert communication aligned with a specific therapeutic paradigm. Studies have highlighted numerous benefits of using LLMs for psychotherapy while also identifying potential challenges and inaccuracies.
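
As a purely illustrative example of what text-based emotion recognition could look like inside such a pipeline, the sketch below uses the Hugging Face transformers library; the model name is a publicly available emotion classifier chosen for demonstration, not a clinically validated instrument.

    # Illustrative sketch: scoring the emotional tone of a patient message
    # with an off-the-shelf text classifier (example model, not clinical-grade).
    from transformers import pipeline

    emotion_clf = pipeline(
        "text-classification",
        model="j-hartmann/emotion-english-distilroberta-base",  # example model
        top_k=None,  # return a score for every emotion label
    )

    scores = emotion_clf("I can't sleep and everything feels pointless.")[0]
    for item in sorted(scores, key=lambda s: s["score"], reverse=True):
        print(f"{item['label']}: {item['score']:.2f}")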

First and foremost, to maximize results and minimize errors, the chatbot requires proper training with well-structured guidelines [8, 9, 10, 11, 12].
Such training should enable the construction of clinically validated psychotherapy conversation protocols by integrating elements of established and validated clinical practice with the technological tool’s potential, including the speed of evaluation processes and its continuous availability. In the future, speech comprehension/generation modules and emotion recognition algorithms based on facial expressions and physiological inputs from neuropsychophysiological monitoring devices should also be integrated.
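
By way of illustration only, the sketch below shows one way such guidelines could be encoded in software: a small protocol object that renders its rules into a system prompt and applies a crude keyword check to route high-risk messages to a human clinician. Every name, rule, and keyword here is a hypothetical placeholder, not a validated clinical protocol.

    # Hypothetical sketch: encoding conversation guidelines as data, plus a
    # crude keyword-based escalation check. Placeholder rules, not clinical advice.
    from dataclasses import dataclass, field

    @dataclass
    class TherapyChatProtocol:
        paradigm: str                          # therapeutic orientation to emulate
        guidelines: list = field(default_factory=list)
        escalation_terms: tuple = ("suicide", "self-harm", "hurt myself")

        def system_prompt(self) -> str:
            """Render the protocol as a system prompt for an LLM."""
            rules = "\n".join(f"- {g}" for g in self.guidelines)
            return f"You follow a {self.paradigm} approach.\nRules:\n{rules}"

        def needs_escalation(self, user_message: str) -> bool:
            """Flag messages that must be routed to a human clinician."""
            text = user_message.lower()
            return any(term in text for term in self.escalation_terms)

    protocol = TherapyChatProtocol(
        paradigm="integrated Gestalt",
        guidelines=["Never provide a diagnosis.",
                    "Suggest professional help for persistent distress."],
    )
    print(protocol.system_prompt())
    print(protocol.needs_escalation("Sometimes I think about self-harm."))  # True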

We may not yet be able to offer a definitive and complete answer to the question that opened this reflection, but we can offer a thought-provoking perspective: one that, supported by an ethically critical approach, envisions a future of psychotherapy significantly influenced by artificial intelligence while recognizing the true challenge, namely integrating AI without compromising the centrality of the human relationship.

Thus, while AI offers powerful and innovative tools, doubt lingers as to whether it will ever fully replace the experience, empathy, complexity, and 'warmth' of therapeutic interaction. A romantic and deeply humanistic vision, one that profoundly guides our research group, holds that rather than a futuristic replacement, a balanced integration of technology and mental health professionals is desirable: an equilibrium that supports practitioners while never losing sight of what numerous scientific studies have identified as the true essence of care, the human relationship.

References

  1. Cioffi, V., Mosca, L. L., Moretto, E., Ragozzino, O., Stanzione, R., Bottone, M., Maldonato, N. M., Muzii, B., & Sperandeo, R. (2022). Computational methods in psychotherapy: A scoping review. International Journal of Environmental Research and Public Health, 19(19), 12358.
  2. Patel, S. B., Lam, K., & Liebrenz, M. (2023). ChatGPT: friend or foe? Lancet Digital Health, 5(3), e102.
  3. Liebrenz, M., Schleifer, R., Buadze, A., Bhugra, D., & Smith, A. (2023). Generating scholarly content with ChatGPT: ethical challenges for medical publishing. Lancet Digital Health, 5(3), e105-e106.
  4. Pierucci, P. Introduzione a Chat GPT: cos'è, vantaggi e casi studio. www.digitaldictionary.it. Accessed 27 November 2023.
  5. Alkaissi, H., & McFarlane, S. I. (2023). Artificial hallucinations in ChatGPT: implications in scientific writing. Cureus, 15(2).
  6. Cheng, S. W., Chang, C. W., Chang, W. J., Wang, H. W., Liang, C. S., Kishimoto, T., Chang, J. P., Kuo, J. S., & Su, K. P. (2023). The now and future of ChatGPT and GPT in psychiatry. Psychiatry and Clinical Neurosciences, 77(11), 592-596.
  7. Farhat, F. (2023). ChatGPT as a complementary mental health resource: a boon or a bane. Annals of Biomedical Engineering, 1-4.
  8. Raile, P. (2024). The usefulness of ChatGPT for psychotherapists and patients. Humanities and Social Sciences Communications, 11(1), 1-8.
  9. Eshghie, M., & Eshghie, M. (2023). ChatGPT as a therapist assistant: a suitability study. arXiv preprint arXiv:2304.09873.
  10. Ajlouni, A., Almahaireh, A., & Whaba, F. (2023). Students' perception of using ChatGPT in counseling and mental health education: The benefits and challenges. International Journal of Emerging Technologies in Learning (iJET), 18(20), 199-218.
  11. Singh, O. P. (2023). Artificial intelligence in the era of ChatGPT: Opportunities and challenges in mental health care. Indian Journal of Psychiatry, 65(3), 297-298.
  12. Vowels, L. M., Francois-Walcott, R., & Darwiche, J. (2023). AI in relationship counselling: Evaluating the therapeutic effectiveness of ChatGPT in providing relationship advice.
  13. Alanezi, F. (2024). Assessing the effectiveness of ChatGPT in delivering mental health support: a qualitative study. Journal of Multidisciplinary Healthcare, 461-471.