Why I will never use ChatGPT as a therapist
Privacy concerns, ecological costs, and yet-to-be-determined effectiveness aside, nothing beats human connection
By Anis Nabilah Azlee
It seems AI can do it all these days: create laundry and to-do lists, “help” students with their schoolwork, and even become your therapist.
On TikTok, a growing number of users rave about how seen and understood AI chatbots make them feel, sharing that ChatGPT and other AI tools have guided them through difficult times and helped a lot with self-reflection.
As a matter of fact, people are even sharing ways to “hack” AI chatbots with specific prompts in order to get the best possible advice and responses.
But others, including myself, are not as convinced.
TikTok user @catgpt, for instance, questions the effectiveness of therapy rendered by AI, given that it will change its responses until you’re satisfied with the outcome, while a therapist with lived experience and a clinical background won’t. She posits that AI “therapy” may leave you with useful information, but not necessarily advice that’s good or suitable for your mental health.
With the safety, effectiveness, and environmental impact of AI therapy still up for debate, here’s why I won’t be turning to generative AI for mental health help any time soon.
Understanding its appeal
TikTok’s favourite AI therapist, ChatGPT, is powered by a Large Language Model (LLM), which means it generates text in response to prompts based on patterns learned from vast amounts of training data. Combined with context from your previous messages, this lets ChatGPT provide individualised and useful-sounding answers, leading proponents of ChatGPT therapy to claim that it makes them feel heard and understood.
Additionally, many AI chatbots offer users the option to fully customise the bot they interact with. For instance, character.ai allows users to choose their chatbot’s voice, personality, and appearance. Essentially, users can personalise their “therapist” to ensure a comforting and engaging experience, whether they prefer ranting to their favourite celebrity or simulating a conversation with an entirely invented persona.
There’s even a chatbot called Pi AI that’s marketed as the world’s first emotionally intelligent AI. Unlike other platforms, Pi AI encourages users to get personal and reflective through built-in prompts like “Get philosophical” or “Shift your perspective”.
Thus, beyond being customisable, AI chatbots are available 24/7 to help you talk through and process your feelings and, most importantly, are free to use.
These aspects make them highly appealing, especially for people who don’t want to burden their friends with their issues or overshare, might not be able to afford or access therapy, or just need a listening ear.
Not a real therapist
While I acknowledge the merits of a free and always-available mental health resource, I don’t think ChatGPT can ever produce the kind of results you’d get from long-term work with a trusted and certified therapist, especially if you’re in need of human connection.
Therapy is a highly relational practice, which means the therapist-client relationship is a key factor in its success. Unlike human therapists, chatbots are unable to read your body language and other non-verbal cues (or at least, not yet). They can’t respond unless you explicitly type into the interface that you’re feeling sad or that you’re crying, which is an awkward thing to do when you’re already experiencing emotional turmoil.
At its core, ChatGPT requires you to verbalise your emotions through text, which just isn’t an option sometimes. Occasionally, the situation calls for you to just sit with and feel your feelings, letting these heavy emotions pass through you to truly feel a release, instead of intellectualising them.
And while you can insert specific prompts to get ChatGPT to act like a therapist, such as asking it to notice patterns in your feelings, thoughts, and actions, or to pose questions that evoke self-reflection, it doesn’t do a good job of picking up on nuances in your behaviour and language. A sarcastic remark could be misconstrued as a serious one, and ChatGPT will respond accordingly.
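To see what a prompt-built “therapist” actually is under the hood, here’s a minimal sketch using OpenAI’s Python SDK. The model name, the persona wording, and the sarcastic test message are my own illustrative assumptions, not anything these apps are confirmed to use:

```python
# Minimal sketch: the entire "therapist" persona is just an instruction
# in the prompt. Assumes the official OpenAI Python SDK (pip install openai)
# and an API key in the OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()

messages = [
    # The whole persona lives in this one system message (wording is hypothetical).
    {"role": "system", "content": (
        "Act as a warm, reflective therapist. Notice patterns in my "
        "feelings, thoughts, and actions, and ask questions that "
        "encourage self-reflection."
    )},
    # The model only ever sees what the user types; a sarcastic line like
    # this arrives as plain text, with no tone of voice or body language.
    {"role": "user", "content": "Great, another wonderful Monday."},
]

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative; any chat model works the same way
    messages=messages,
)
print(response.choices[0].message.content)
```

Everything the model “knows” about you is just that list of typed messages; tone, facial expression, and anything you don’t write down simply never reach it.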
Furthermore, although ChatGPT can sound like it truly understands your concerns, all it’s doing is recognising your input and matching it with information learned from its training datasets, other users’ conversations, and your own previous messages.
It’s great at pulling data and transforming it into digestible information for you, but it lacks the kind of empathy and personal experience a real therapist might have that can take your therapeutic experience to a deeper level, where there’s trust and rapport.
Put simply, ChatGPT tells you what you think you need to hear (you are controlling the prompts, after all) rather than offering what might be truly necessary in the moment, the way a good therapist would anticipate and provide.
For instance, my therapist, who’s grown accustomed to my quirks after having worked together for a while, now knows to prepare tissues for me when I grow silent and look away or shift in my seat a certain way.
While some things I say are met with immediate replies of reassurance from him, he knows when to give me room to vent before inviting me to analyse the situation and brainstorm potential solutions together. And sometimes, we’ll just let my thoughts linger and hang out in silence for a bit before moving on to other topics.
This level of understanding and connection is something that ChatGPT simply cannot recreate. It’s the human touch and ability to read between the lines and be fully present that provides nuanced, personalised care.
Where do my queries go?
Another key feature that ChatGPT therapy lacks is the guarantee of patient-therapist confidentiality. Again, ChatGPT is not a trained therapist and therefore has no concept of the sanctity of this confidentiality.
Whenever you type something into ChatGPT or similar platforms, your data can be saved and used to train and improve their models for a better user experience.
But these AI platforms are vulnerable to bugs, which means your data is at risk of being leaked. In March 2023, ChatGPT was taken offline due to a bug in an open-source library it used, which allowed some users to see titles from other active users’ chat histories. Yikes.
Companies worldwide have also moved to ban or restrict the use of ChatGPT at work to keep private information from ending up on external servers or in front of other users.
As such, I can’t be entirely sure where my information and deepest, darkest secrets will end up, and like these companies, I’m not willing to risk the entire world knowing my business.
The hidden cost of generative AI therapy
While using ChatGPT for therapy may be free for you, something else is paying for it — the planet.
The International Energy Agency (IEA) estimates that a single ChatGPT query uses 2.9 watt-hours (Wh) of electricity, about 10 times as much as a Google search.
As more users turn to AI to support their mental health, the environmental impact of the computational infrastructure required to run these models, like data centres, becomes more significant.
In fact, the IEA projects that AI will be a significant driver of global electricity demand: electricity consumption by data centres worldwide is set to more than double to around 945 terawatt-hours (TWh) by 2030, slightly more than Japan’s total electricity consumption today.
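To put those figures in personal terms, here’s a quick back-of-envelope sketch. The 2.9 Wh and roughly-ten-times-Google numbers come from the IEA estimate above; the daily-usage and user-count figures are purely hypothetical assumptions for the sake of the arithmetic:

```python
# Back-of-envelope arithmetic based on the IEA figures cited above.
# The usage and user-count numbers below are hypothetical assumptions.
WH_PER_CHATGPT_QUERY = 2.9   # IEA estimate, watt-hours per query
WH_PER_GOOGLE_SEARCH = 0.3   # roughly a tenth of a ChatGPT query

print(f"One ChatGPT query ~ {WH_PER_CHATGPT_QUERY / WH_PER_GOOGLE_SEARCH:.0f} Google searches")

queries_per_day = 30         # hypothetical daily "AI therapy" habit
kwh_per_user_per_year = WH_PER_CHATGPT_QUERY * queries_per_day * 365 / 1000
print(f"One heavy user: ~{kwh_per_user_per_year:.0f} kWh per year")   # ~32 kWh

users = 1_000_000            # hypothetical number of daily users
gwh_per_year = kwh_per_user_per_year * users / 1_000_000
print(f"{users:,} such users: ~{gwh_per_year:.0f} GWh per year")      # ~32 GWh
```

Tiny per query, in other words, but the habit adds up quickly once millions of people make a chatbot their daily confidant.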
While AI might provide potential therapeutic benefits, I don’t think it quite justifies the increase in electricity demands and its resulting strain on our planet.
The bottom line
At the end of the day, it is the LLM’s job to produce output that we perceive as useful. For those in between therapy sessions or looking to supplement professional mental health help, ChatGPT can be a great way to process day-to-day experiences or serve as an outlet to vent about negative emotions and experiences. It can also point you in the direction of other mental health resources.
But for me, its uses stop there — generative AI simply cannot provide me with the specialised care and human empathy I need. A robo-therapist, no matter how advanced, lacks the warmth, genuine compassion, and emotional intelligence I look for in my healing process with an expert therapist.
Above all, it’s important to understand your expectations for therapy. If ChatGPT works for you, great! But if you’re like me and you’re seeking more well-rounded care, I’d recommend reining in your usage to avoid an over-reliance on AI therapy and considering a human therapist for a touch of human connection.
If you’re struggling with your mental health and need someone to talk to, these helplines are available.
- Singapore Association for Mental Health: 1800-283-7019
- Samaritans of Singapore (SOS): 1767
- SOS 24-hour CareText: 9151 1767