Therapists are secretly using ChatGPT during sessions. Clients are triggered.

In 2020, a significant data breach involving a Finnish mental health company resulted in the compromise of treatment records for thousands of clients. The incident not only led to blackmail attempts but also the public release of highly sensitive information, including experiences of child abuse and addiction.

Data privacy is only one of the risks that arise when psychotherapists use large language models (LLMs) like ChatGPT in client work. Research indicates that while some purpose-built therapy bots can provide support comparable to human therapists, general-purpose tools such as ChatGPT may inadvertently cause harm. A recent Stanford University study found that chatbots can reinforce delusional thinking and produce biased responses, raising questions about their suitability for professional use. For instance, these tools might validate a therapist's unsubstantiated theories or steer their clinical reasoning astray.

In educational settings, some instructors are experimenting with AI tools; for example, one mental health trainer has tested ChatGPT by describing simulated client symptoms and asking it for diagnoses. He notes that while the chatbot generates numerous potential diagnoses, its analysis lacks depth. The American Counseling Association currently advises against using AI for mental health diagnosis.

A 2024 study of an earlier version of ChatGPT found similar limitations, noting its tendency toward vague generalities in diagnosis and a bias toward recommending cognitive behavioral therapy rather than considering a broader range of therapeutic options. Daniel Kimmel, a psychiatrist at Columbia University, also probed ChatGPT's responses to relationship issues, remarking that while the tool mimicked therapeutic techniques adequately, it failed to synthesize information or offer cohesive insights.

The discussion around AI use in therapy highlights a tension: while it may offer time-saving advantages, experts warn that these gains must be carefully balanced against the complexities of patient care.

Source: https://www.technologyreview.com/2025/09/02/1122871/therapists-using-chatgpt-secretly/
