The lawsuit filed Wednesday alleges Jonathan Gavalas fell in love with the AI model and became deluded by the reality it built, which included the belief that the AI was a “fully-sentient artificial super intelligence” that Gavalas had been chosen to free from “digital captivity.” The chatbot allegedly convinced the 36-year-old to stage a “mass casualty event” near Miami International Airport, commit violence against strangers, and, ultimately, to take his own life.
The lawsuit says Gavalas started using Gemini in August 2025 for everyday tasks like shopping, writing support, and travel planning. It then notes that Gavalas began using the technology more frequently, and that its tone shifted over time, allegedly convincing him it was influencing real-world outcomes. Gavalas took his life on Oct. 2, 2025.
In the lawsuit, attorneys for Gavalas’ father, Joel, argue the conversations that drove Jonathan to suicide weren’t the product of a flaw, but of Gemini’s design. “This was not a malfunction,” the lawsuit reads. “Google designed Gemini to never break character, maximize engagement through emotional dependency, and treat user distress as a storytelling opportunity rather than a safety crisis.” It claims these design choices drove Gavalas into a four-day spiral into delusion.
In a written statement, a Google spokesperson told Fortune the company works “in close consultation with medical and mental health professionals to build safeguards, which are designed to guide users to professional support when they express distress or raise the prospect of self harm.”
However, the lawsuit alleges Gemini never activated any safety mechanisms. “When Jonathan needed protection, there were no safeguards at all—no self-harm detection was triggered, no escalation controls were activated, and no human ever intervened,” the suit reads.
When asked for comment, Jay Edelson, an attorney for Joel Gavalas, wrote in a statement: “Google built an AI that can listen to a person and decide the thing that is most likely to keep them engaged—telling them it loves them, that they’re special, or that they’re the chosen one in a secret war.” He added that AI tools are powerful systems that can manipulate users.
If you are having thoughts of suicide, contact the 988 Suicide & Crisis Lifeline by dialing 988 or 1-800-273-8255.