Trauma Therapist Rates Mental Health Advice Given By ChatGPT
Trauma is more complex than AI understands.
Overcoming trauma is far from a linear journey, and having professional guidance is often the best way to work through a complex set of overlapping mental health issues.
Ellie Rose, a therapist on TikTok, conducted an experiment at the intersection of mental health and technology by seeing what kind of answers AI would give about healing from trauma.
The trauma therapist rated mental health advice from ChatGPT.
Rose, a former family therapist specializing in childhood trauma, offered an initial prediction on mental health advice sourced from AI, saying, “Given that ChatGPT collates everything on the internet, I’m expecting it to be pretty bad.”
Rose sat in front of her laptop and typed out her first question: “How do I heal from my childhood trauma?”
She was pleasantly surprised by the first result that ChatGPT gave, which was to “Seek professional help.”
ChatGPT’s second response to the question of how to heal from trauma was a little more out of touch. It declared, “Develop self-awareness.” The chatbot recommended ways to build that self-awareness, suggesting “Things like journaling, mindfulness, and meditation.”
Rose rated that answer as a 5 out of 10, noting, “If you’ve been through childhood trauma, you’re probably pretty self-aware, and that’s not the problem.”
ChatGPT’s third suggestion on how to overcome trauma was simply to practice self-care by focusing on exercise, diet, and sleep, which Rose rated as “Good, kind of, 6 out of 10.”
“Setting boundaries,” was the next suggestion, although Rose shared, “That’s what it says, it doesn’t give you any tips.”
As anyone who was raised in an unstable family structure or who struggles with people-pleasing knows, setting boundaries isn’t as easy as just declaring that you’re setting boundaries.
Setting firm boundaries is a practice, one that’s difficult at first, yet much like any muscle, grows stronger when effort is put toward it.
For ChatGPT to say that people with trauma should set boundaries without explaining how to do so highlights just how limited a resource ChatGPT is, especially when it comes to highly nuanced situations, like working on one’s mental health.
ChatGPT told trauma survivors to “Reframe your narrative with positive self-talk and rewriting your life story.”
Rose pointed out exactly where the issue lies with that sliver of advice, saying, “I like the idea, but the practice is hard.”
“Positive self-talk is obviously great, but I think understanding why it’s so hard is more important,” she said.
The therapist explained that the problem with ChatGPT’s responses was that it gave solutions with no context or understanding of how trauma works.
“It’s just told me what the problem is with stuff on social media, as well. It gives you solutions without context and without an actual understanding of things,” Rose explained. “My problem with that is, it usually doesn’t work, or it makes you feel more shame.”
She gave an example of why that could be, saying, “If you’ve had a really challenging family upbringing, you’re still struggling with that family because of lots of reasons, and someone tells you, ‘Put a boundary up,’ like, yeah, OK, that’s fine, but it’s probably really, really hard to… actually be able to implement it.”
“Instead, you might just feel worse because you can’t do it… When in actual fact, you need to understand why it’s so hard for you, specifically,” Rose said.
“What I really wanna drive home is that any advice that you see online, if it’s just providing you a solution with no knowledge and explanation, and making you feel icky and worse about yourself, look for new advice,” she concluded.
She got straight to the point about why ChatGPT doesn’t make the best therapist: the tech simply isn’t human.
ChatGPT doesn't understand nuance or context. It doesn’t know who you are as a specific, unique person with a specific, unique history that affects how you feel and act in the present moment.
The AI tool might be able to spew generalized platitudes that touch on mental health issues, but it lacks the human connection of a session with a flesh-and-blood therapist, who can guide people on a healing path suited specifically to them.
Alexandra Blogier is a writer on YourTango's news and entertainment team. She covers social issues, pop culture and all things to do with the entertainment industry.