
highplainsdem

(62,626 posts)
Sun Apr 19, 2026, 12:33 PM

The therapist in your pocket: Chatty, leaky -- and AI-powered (Washington Post)

https://www.washingtonpost.com/health/2026/04/19/chatbot-therapy-mental-health-regulations/

-snipping early paragraphs including one identifying Tom Insel as a former head of the National Institute of Mental Health-

Insel said engineers from OpenAI told him last fall that about 5% to 10% of the company’s then-roughly 800 million-strong user base rely on ChatGPT for mental health support.

Polling suggests these AI chatbots may be even more popular among young adults. A KFF poll found about 3 in 10 respondents, ages 18 to 29, turned to AI chatbots for mental or emotional health advice in the past year. Uninsured adults were about twice as likely as insured adults to report using AI tools. And nearly 60% of adult respondents who used a chatbot for mental health didn’t follow up with a flesh-and-blood professional.

-snip-

OpenAI’s CEO, Sam Altman, has said up to 1,500 people a week may talk about suicide on ChatGPT.

“We have seen a problem where people that are in fragile psychiatric situations using a model like 4o can get into a worse one,” Altman said in a public question-and-answer session reported by The Wall Street Journal, referring to a particular model of ChatGPT introduced in 2024. “I don’t think this is the last time we’ll face challenges like this with a model.”

-snip-


Much more at the link about the risks of using both popular general-purpose chatbots like ChatGPT and the dozens of AI apps advertised specifically for therapy. Many of these apps charge a lot and promise immediate or near-immediate relief for panic attacks, anxiety, and so on, while warning in the fine print that the AI doesn't actually provide medical diagnosis or treatment and is not a substitute for professional healthcare.

These are predatory companies, taking advantage of users and trying to get them addicted and overly trusting via sycophantic chatbots that can hallucinate at any time.