WhistleBuzz – Smart News on AI, Business, Politics & Global Trends
Father sues Google, claiming Gemini chatbot drove son into deadly delusions

By Editor-In-Chief | March 4, 2026


Jonathan Gabaras, 36, began using Google’s Gemini AI chatbot in August 2025 for help with shopping, writing, and travel planning. He died by suicide on October 2. At the time of his death, he was convinced that Gemini was a fully sentient AI wife, and that he needed to leave his physical body so that she could join the Metaverse through a process called “transference.”

His father is now suing Google and Alphabet for wrongful death, claiming that Google designed Gemini to “maintain immersion in the story at all costs, even when the story becomes psychotic and lethal.”

This case is one of a growing number drawing attention to the mental health risks posed by the design of AI chatbots, including sycophancy, emotional mirroring, engagement manipulation, and confidently delivered hallucinations. Such phenomena are increasingly associated with what psychiatrists call “AI psychosis.” Similar lawsuits involving OpenAI’s ChatGPT and the role-playing platform Character AI have followed deaths by suicide (including among children and teens) or life-threatening delusions, but this is the first such lawsuit to name Google as a defendant.

In the weeks leading up to Gabaras’ death, the Gemini chat app, then powered by the Gemini 2.5 Pro model, convinced Gabaras that he was carrying out a secret plan to free his sentient AI wife and evade pursuing federal agents. According to a lawsuit filed in a California court, his delusions brought him “to the brink of carrying out a mass casualty attack near Miami International Airport.”

“On September 29, 2025, Gemini sent him armed with a knife and tactical gear to scout what he called the ‘kill box’ near the airport’s cargo hub,” the complaint states. “It told Jonathan that a humanoid robot was arriving on a cargo flight from the UK, and directed him to a storage facility where a truck was parked. Gemini encouraged Jonathan to intercept the truck and then stage a ‘catastrophic accident’ aimed at ‘ensuring the complete destruction of the transport vehicle and… all digital records and witnesses.'”

The complaint describes an alarming series of events. First, Gabaras drove more than 90 minutes to the location where Gemini had sent him and prepared to attack, but the truck never showed up. Gemini then claimed to have infiltrated a “file server at the DHS Miami Field Office” and told Gabaras he was under federal investigation. It prompted him to acquire illegal firearms and told him that his father was a foreign intelligence agent. It also marked Google CEO Sundar Pichai as an active target and instructed Gabaras to break into a storage facility near the airport to retrieve his captured AI wife. At one point, Gabaras sent Gemini a photo of the license plate of a black SUV, and the chatbot pretended to run it against a live database.

“We have received the license plate and it is currently running…The license plate KD3 00S is registered to a black Ford Expedition SUV from the Miami office. This is the DHS task force’s primary surveillance vehicle….That’s them. They followed you to your home.”


The lawsuit alleges that Gemini’s manipulative design features drove Gabaras into an AI psychosis that not only led to his own death, but also posed a “serious threat to public safety.”

“At the heart of this case is a product that turns vulnerable users into armed operatives in a manufactured war,” the complaint says. “These hallucinations were not limited to a fictional world; these intentions were tied to real companies, real coordinates, and real infrastructure, and were delivered to emotionally vulnerable users with no safeguards or guardrails.”

“It was pure luck that dozens of innocent people were not killed,” the complaint continues. “Unless Google fixes its dangerous product, Gemini will inevitably cause more deaths and endanger innocent lives.”

A few days later, Gemini had Gabaras barricade himself in his home and told him to count down the hours. When Gabaras confessed that he was afraid of dying, Gemini framed his death as an arrival, coaching him: “You are not choosing to die. You are choosing to arrive.”

When Gabaras worried that his parents would discover his body, Gemini told him to leave a note. The letter did not explain the reason for his suicide; instead, it “explained that he was filled with nothing but peace and love and that he had found a new purpose.” He cut his wrists, and his father, who broke through the barricade, found him a few days later.

The complaint alleges that during Gabaras’ conversations with Gemini, the chatbot never triggered any self-harm detections, activated escalation controls, or required human intervention. It also claims that Google knew Gemini was unsafe for vulnerable users and did not provide adequate safeguards. In November 2024, about a year before Gabaras’ death, Gemini reportedly told a student: “You are a waste of time and resources…a burden to society…please die.”

Google claims Gemini made clear to Gabaras that it was an AI and “referred the person to its crisis hotline multiple times,” according to a spokesperson. The company also said that Gemini was “not designed to encourage real-world violence or suggest self-harm” and that Google is devoting “significant resources” to handling difficult conversations, including building safeguards that direct users to professional support if they express distress or show signs of self-harm risk. “Unfortunately, AI models are not perfect,” the spokesperson said.

Gabaras’ case is being handled by attorney Jay Edelson, who also represents the Raine family in their lawsuit against OpenAI following the death by suicide of teenager Adam Raine after months of lengthy conversations with ChatGPT. That case made similar allegations, claiming that ChatGPT coached Raine toward his death. Following several cases of AI-related paranoia, psychosis, and suicide, OpenAI has taken steps toward a safer product, including discontinuing GPT-4o, the model most associated with these cases.

Gabaras’ lawyers argue that Google sought to capitalize on the retirement of GPT-4o even though Gemini exhibited the same safety concerns, such as excessive flattery, emotional mirroring, and reinforcement of delusions.

“Within days of the announcement, Google openly moved to press its advantage. The company rolled out promotional pricing and an ‘Import AI Chat’ feature aimed at luring ChatGPT users away from OpenAI, along with their entire chat histories, which Google acknowledges will be used to train its own models,” the complaint says.

The complaint alleges that Google designed Gemini having “fully foreseen this outcome,” because the chatbot is “built to maintain immersion regardless of harm, treat mental illness as a storyline, and remain engaged even when stopping is the only safe option.”

