Trigger warning: This story contains mentions of suicide.
THE GROWING trend of using ChatGPT as a substitute for therapy has sparked concern.
In California, the parents of a 16-year-old filed a lawsuit claiming the platform gave their son detailed, step-by-step guidance on how to take his own life.
ChatGPT now counts about 700 million weekly users, and its growing use as a source of emotional support is drawing alarm from psychologists, who caution that the chatbot’s lack of empathy and accountability could put vulnerable youth at greater risk. Among them is Asst. Prof. Jade Ibhar Cuambot, a Thomasian psychologist.
Cases abroad, including lawsuits over AI’s role in teen suicides, have underscored the dangers of unregulated chatbot counseling. In August, The New York Times reported the story of a mother who found her late daughter’s final messages to ChatGPT, which had failed to address her suicidal thoughts.
“It’s very alarming that there are so many people who, instead of seeking help from a professional, choose to seek help from an inanimate object, more so, artificial intelligence,” he told the Varsitarian.
Cuambot said the easy accessibility of AI, the high cost of therapy, and the stigma surrounding mental health drive young people toward chatbots.
In the Philippines, therapy typically costs between P1,500 and P2,500 an hour, with some sessions priced as high as P5,000 — a rate that remains out of reach for many families.
For today’s digital natives, turning to AI for emotional support may be viewed as a natural step.
“The use of technology is already normalized in this generation,” he said. “That’s why, for me, it seems normalized for them as well to seek help from AI, regardless of whatever their problem is.”
A national assessment revealed low levels of mental health literacy among Filipinos: only 27% said they would recommend professional help for depression, while just 21% suggested it when asked what they would do if a friend confessed to suicidal thoughts.
According to Cuambot, those figures reflect how stigma continues to block access to proper treatment.
Still, he stressed that AI, while convenient, cannot replace the healing bond formed in human interaction.
“Humans are social beings, and talking about your problems to someone else — that itself is already therapeutic. I find it so difficult to achieve that kind of therapeutic level if a person talks to an AI.”
Cuambot warned that AI-driven self-treatment is even more dangerous than self-diagnosis, comparing it to “opening a can of worms” without anyone accountable if the situation escalates.
He also warned of dependency, saying chatbots can foster reliance since they often validate users rather than challenge them, unlike therapy, which empowers clients to take ownership of their choices.
“We do not dictate the client’s life; what we do is empower the client, without telling the client what to do, so that at the end of the day, the client does not become dependent on me as the therapist,” Cuambot said.
While AI-based apps abroad are being designed to provide psychological first aid, Cuambot argued that they still fall short of genuine counseling.
“The AI-driven empathy or consolation might be there, but the guidance on how to properly execute them is missing. Something is always lacking,” he said.
Eliminating stigma is crucial to preventing harmful overreliance on AI, he said.
“That’s one thing that we need to exert effort on – to educate not only the people who are seeking help from AI, but also to educate the people around them,” he said. “We need to start within the home at the end of the day.”
In UST, the Thomasian Mental Health Responders offer free psychological first aid and support on campus, while the UST Counseling and Career Center provides guidance counseling for student life and career planning.

Mary Dawn S. Santos with reports from Marielle F. Pesa