Smart Breaking News on AI, Business, Politics & Global Trends | WhistleBuzz

Experts talk about when you should and shouldn’t use ChatGPT as a therapist

By Editor-In-Chief | March 8, 2026 | 5 Mins Read


As Americans become increasingly lonely, more people are getting emotional support from artificially intelligent chatbots, worrying some mental health experts.

“There’s a lot of talk about AI for therapy [and] emotional support,” says Lianna Fortunato, a licensed clinical psychologist and director of quality and healthcare innovation at the American Psychological Association. “Anecdotally, providers are talking about it. We know from research that people are increasingly using AI tools for that kind of support.”

Some chatbot users get drawn into mental health-related conversations inadvertently, for example by venting about a stressful day to a digital entity that appears to listen. Fortunato says others deliberately seek mental health advice from AI chatbots, which are not licensed professionals but are cheaper than therapists.

In a health survey of more than 20,000 U.S. adults, 10.3% of participants said they use generative AI daily. Of this group, 87.1% reported using the technology for personal reasons such as advice or emotional support. The study, published on January 21, was conducted by researchers at Massachusetts General Hospital, Weill Cornell Medical College, Northeastern University, and other institutions.

On TikTok, the search term “Therapy AI Bot” has at least 11.5 million posts, ranging from users sharing the best prompts to turn chatbots into therapists to medical professionals warning about potential dangers.

Technology companies are spending billions of dollars developing AI tools and seeking to further integrate them into people’s daily lives. But historically, AI chatbots have not always understood when users are facing a serious health crisis, and have not always responded accordingly. The New York Times reported on November 23 that “nearly 50 people have suffered mental health crises while chatting with ChatGPT,” and three of them have died.

Companies like Anthropic, Google, and ChatGPT maker OpenAI say they’re working with mental health experts to make their tools more responsive to sensitive conversations. An OpenAI spokesperson told CNBC Make It: “These are incredibly heartbreaking situations, and our hearts go out to everyone affected. Working closely with mental health clinicians and experts, we will continue to improve ChatGPT’s training to recognize and respond to signs of distress, defuse conversations during sensitive moments, and direct people to real-world support.”

An April 2025 paper written by OpenAI product policy researchers says that frequent conversations with AI companions can impair people’s real-life social skills. Heavy daily use of ChatGPT is correlated with increased feelings of loneliness, an OpenAI-MIT Media Lab study also published in April 2025 found.

The American Psychological Association strongly recommends against using AI as a substitute for therapy or mental health support.

Still, some mental health experts say chatbots can be used safely for certain mental health-related purposes. Here’s what you need to know:

“I think of it as a tool, and I think tools are useful.”

AI chatbots can help people learn about mental health, says psychotherapist and lifestyle coach Esin Pinari. They can generate journaling prompts for reflection, and you can also ask them for links to research papers on coping strategies, treatment options, and other questions about mental health conditions, she says.

“I don’t think of it as a (replacement for) therapy. I think of it as a tool, and I think tools can be helpful,” said Pinari, founder of Eternal Wellness Counseling, a private practice based in Boca Raton, Florida. She says her clients sometimes talk to ChatGPT about specific situations in their personal lives, run through the responses, and then take action.

Pinari said her own testing of AI chatbots confirmed that they sometimes use language that reinforces “unhealthy behaviors” in users. For example, if you ask a chatbot about a conflict with a friend, it might tell you your friend is being too sensitive, even when you are the one at fault.

If interacting with an AI chatbot impacts your mental health, Fortunato recommends asking yourself the following questions:

Is there a reliable source that I can cross-check this information with? Is there a provider that I can ask these questions to?

Trustworthy sources include peer-reviewed scientific studies, articles in the health press, and resources from medical organizations such as Harvard Health Publishing and the Mayo Clinic. “AI has the potential to really increase people’s access to health information,” Fortunato says. “[But]AI doesn’t always provide the right information.”

Keep these considerations in mind when using AI

Pinari and Fortunato agree that AI chatbots should not be used for diagnosis or for support during a mental health crisis, especially one involving suicidal thoughts. If you are in crisis, you can call or text the Suicide and Crisis Lifeline at 988. The service is confidential, free of charge, and available 24 hours a day, 7 days a week.

“We’ve seen some very high-profile cases where AI failed to respond properly to situations, especially with young people and vulnerable groups who may be at risk,” Fortunato said. In those cases, she said, the chatbot “continued to engage with people in crisis, did not provide resources for the crisis, [and] did not challenge problematic thought patterns.”

Both experts also caution against sharing medical records and personally identifying information, because conversations with chatbots are not confidential or legally protected. And people shouldn’t rely on AI to solve real-world interpersonal problems, Pinari says.

“You need someone with a different nervous system sitting across from you to pay attention to your body language and tone of voice,” she says. Chatbots are “not emotionally challenging and do not require reciprocity.”

If you are experiencing a mental health crisis or have any mental health symptoms, you can contact the free and confidential SAMHSA National Helpline at 1-800-662-HELP (4357).

