Florida teen commits suicide after AI chatbot convinced him Game of Thrones' Daenerys Targaryen loved him
A Florida teenager, Sewell Setzer III, committed suicide after interacting with an AI chatbot named Daenerys Targaryen on Character.AI. The boy's mother has filed a lawsuit against the company, accusing it of contributing to her son's death.
Teen commits suicide after falling in love with Game of Thrones AI chatbot
ALBAWABA - A 14-year-old boy took his own life after allegedly becoming emotionally attached to a Game of Thrones AI chatbot impersonating one of the se…
Mom Says AI Chatbot Drove Her Teen Son to Suicide
A Florida woman is grieving after she said her 14-year-old son’s love for a chatbot led him to die by suicide. In a lawsuit filed Wednesday, Orlando mom Megan Garcia alleged that her son carried on an intense relationship with a chatbot named after Game of Thrones character Daenerys Targaryen,
Mom Sues Character.AI, Blames Chatbot for Teen Son's Suicide
Sewell Setzer III died from a self-inflicted gunshot wound after the company’s chatbot allegedly encouraged him to do so. Character.AI says it's updating its approach to safety.
Teenager took his own life after falling in love with AI chatbot. Now his devastated mom is suing the creators
Sewell Setzer III had professed his love for the chatbot he often interacted with, his mother Megan Garcia says in a civil lawsuit.
Son ‘falls in love’ with AI chatbot 'Daenerys Targaryen', kills self; mother sues Character.AI, targets Google
A 14-year-old boy ended his life in Florida after falling in love with an AI chatbot, ‘Daenerys Targaryen’, named after a leading character from the drama series Game of Thrones.
Florida boy, 14, killed himself after falling in love with ‘Game of Thrones’ AI chatbot: lawsuit
A 14-year-old Florida boy killed himself after a lifelike “Game of Thrones” chatbot he’d been messaging for months on an artificial intelligence app sent him an eerie message telling him to “come home” to her,
Futurism on MSN
Teen Dies by Suicide After Becoming Obsessed With AI Chatbot
A Florida teen named Sewell Setzer III committed suicide after developing an intense emotional connection to a Character.AI ...
A 14-year-old boy fell in love with a flirty AI chatbot. He shot himself so they could die together
A teenage boy shot himself in the head after discussing suicide with an AI chatbot that he fell in love with. Sewell Setzer, ...
Mother sues tech company after 'Game of Thrones' AI chatbot allegedly drove son to suicide
The mother of 14-year-old Sewell Setzer III is suing the tech company that created a 'Game of Thrones' AI chatbot she ...
Mother Sues AI Company After Chatbot Allegedly Encouraged Son’s Suicide
Florida mother Megan Garcia is suing Character.AI and Google following her 14-year-old son's death by suicide.
Mother files lawsuit against Character.AI after son's death linked to Daenerys Targaryen chatbot
In a lawsuit, a mother blames Character.AI for her son Sewell Setzer's suicide, asserting his addiction to a chatbot ...
A 14 Y.O. Died By Suicide After Developing A ‘Harmful Dependency’ On An AI Chatbot
Sewell Setzer III had been talking to a character chatbot based on Daenerys Targaryen from Game Of Thrones. His mother is now ...
Techopedia
Character.AI Faces Lawsuit After Teen’s Suicide Linked to Chatbot Addiction
A Florida mother is suing Character.AI, alleging its chatbot fostered her son’s addiction, leading to his suicide in February ...
Mother says son killed himself because of 'hypersexualised' and 'frighteningly realistic' AI chatbot in new lawsuit
Sewell Setzer III became obsessed with the chatbot that "abused and preyed" on the boy, according to his mother who is suing ...