People are using ChatGPT as a therapist. Mental health experts have some concerns
“So no one else uses chatgpt as an unpaid therapist?” TikTok user Ash Donner asks in a video. While her friends seem shocked by her confession, she’s not alone. A growing number of people are turning to ChatGPT for help navigating friendships and romantic relationships, or simply for advice on how to live a better life.
“Pov: I’m logging onto my weekly therapy session (I’m life-dumping to ChatGPT and not sparing any details of names),” one TikTok user writes. Another asks ChatGPT for a mix of advice from her favorite inspirational speakers, including Brené Brown and Abraham Hicks.
AI chatbot therapists are nothing new. Woebot, for example, launched back in 2017 as a mental health aid that used natural language processing and learned responses to mimic conversation. “The rise of AI ‘pocket therapists’ for mental health support isn’t just a tech trend—it’s a wake-up call for our industry,” says Carl Marci, Chief Clinical Officer and Managing Director of Mental Health and Neuroscience at OM1 and author of Rewired: Protecting Your Brain in the Digital Age. “We face a major supply and demand mismatch in mental health care that AI can help resolve.” In the U.S., one in three people lives in an area with a shortage of mental health workers. At the same time, 90% of people think there is a mental health crisis in the U.S. today.
Given how expensive and inaccessible therapy can be, the appeal of using ChatGPT in this capacity seems clear: the responses are free and fast. The fact that it’s not human is also a draw for some, letting them spill their deepest, darkest thoughts without fear of judgment. But spilling those thoughts to an unregulated technology that raises questions about data privacy and surveillance carries risks of its own. There’s a reason human therapists are bound by strict confidentiality and accountability requirements.
ChatGPT also isn’t built to be a therapist. The tool itself warns users that it “may occasionally generate incorrect information” or “may occasionally produce harmful instructions or biased content.” “Large language models are still untested in their approach,” says Marci. “They may produce hallucinations, which could be disorienting for vulnerable individuals, or even lead to dangerous recommendations around suicide if given the wrong prompts.”
While it can be a fun exercise or an outlet for venting, ChatGPT isn’t meant to be a substitute for therapy. At least not yet.