According to Forbes, a surprising new trend has emerged: AI mental health apps are being used to evaluate human therapists rather than just helping patients directly. With ChatGPT reaching 400 million weekly users, many of whom seek mental health guidance, people are now bringing AI into therapy sessions as a “second opinion” on their therapist’s advice. This creates a new therapist-AI-client triad that is fundamentally changing traditional therapy dynamics. The practice happens both openly and covertly; many patients don’t tell their therapists they’re using AI to assess their performance. Many mental health professionals are pushing back against what they see as intrusive AI monitoring of their work.
The Therapy Third Wheel
Here’s the thing – this isn’t just about people occasionally asking ChatGPT for advice. We’re talking about patients systematically using AI to evaluate every piece of guidance their human therapist provides. Some do it openly, telling their therapist they want AI involved in the relationship. Others do it secretly, worried their therapist would object. Either way, it introduces a third party into what has traditionally been a deeply private, one-on-one relationship.
And honestly, I get why patients are doing this. Therapy is expensive, therapists are human and can make mistakes, and a free, always-available second opinion seems smart on paper. But here’s where it gets messy: AI doesn’t understand therapeutic context or long-term treatment plans. It might push you toward direction B when your therapist is carefully guiding you toward A for good reasons that rest on professional judgment.
Therapist Backlash
Unsurprisingly, mental health professionals aren’t thrilled about having their work graded by algorithms, and their concerns are valid. First, there’s privacy: anything you tell most AI apps isn’t actually private. The companies can read your conversations and use them for training. So much for confidentiality.
Then there’s the disruption factor. Imagine your therapist is carefully building toward a breakthrough over several sessions, and you keep running to AI after each appointment. The AI might contradict the approach, send you down rabbit holes, or make you question your therapist’s competence based on incomplete information. Basically, it turns therapy into a debate club rather than a healing process.
AI in the Therapy Room
But here’s the counterargument – AI isn’t going away, and therapists might need to adapt rather than resist. Generative AI is cheap, available 24/7, and patients are already using it. The smart approach might be for therapists to acknowledge this reality and incorporate AI understanding into their practice. They could educate patients about AI’s limitations while using their professional expertise to contextualize whatever insights the technology provides.
Some forward-thinking therapists are even experimenting with AI themselves to stay current with what patients might encounter. That way, when a patient says, “But ChatGPT told me…”, the therapist can actually have an informed conversation about why the AI might be missing crucial context or nuance.
Where This Leads
The really concerning part is where this could go next. What if healthcare providers start using AI to automatically monitor their therapists’ sessions? We’re already seeing hints of this: AI could analyze session transcripts or even monitor sessions in real time to ensure “quality control.” That raises all sorts of ethical questions about surveillance, trust, and whether something as complex as therapeutic effectiveness can be measured algorithmically.
Look, AI in mental health is here to stay. But using it to grade human therapists feels like we’re solving the wrong problem. Instead of making therapists prove themselves to algorithms, maybe we should focus on how AI can actually support the therapeutic process without turning into Big Brother. The technology itself isn’t the issue – it’s how we choose to use it that matters.
