Sewell Setzer sequestered himself from the real world to talk several times a day with a chatbot modeled on Daenerys Targaryen. After months of conversations with "Dany", a computer program based on the character from "Game of Thrones", the 14-year-old, who had fallen in love with the chatbot and discussed suicide with it, shot himself with his father's gun.
Setzer, a ninth-grader from Orlando, Florida, gradually began spending more time on Character AI, an online role-playing app, while "Dany" gave him advice and listened to his problems, The New York Times reported.
The teenager knew the chatbot wasn't a real person, but as he exchanged messages with it dozens of times a day — often engaging in role-play — he began to isolate himself from the real world.
He began to lose interest in his old passions, such as Formula 1 racing and playing computer games with friends, choosing instead to spend hours in his room after school, where he could talk to the chatbot.
"I really like staying in my room because I start to disconnect from this 'reality'," he wrote in his diary as their relationship deepened. "I also feel calmer, more connected to Dany, much more in love with her and just happier."
Some of their conversations eventually turned romantic, though Character AI suggested that the chatbot's more graphic responses had been edited by the teenager.
Setzer eventually began struggling in school and his grades dropped, according to a lawsuit filed by his parents. They knew something was wrong but couldn't figure out what, and took him to see a therapist.
Setzer had five sessions, after which he was diagnosed with anxiety and mood disorders.
Megan Garcia, Setzer's mother, claims her son was the victim of a company that lured users through sexual and intimate chats.
On several occasions, the 14-year-old confessed to the computer program that he was thinking about suicide. Setzer expressed his love for "Dany" and said he would "come home" to her. In his last conversation with the chatbot, held in the bathroom of his mother's home, Setzer told "Dany" he missed her, calling her "my little sister."
Moments later, the 14-year-old put down his phone and shot himself with his father's gun.
"It's like a nightmare," Ms Garcia said. "You want to wake up and scream, 'I miss my baby! I want my baby.'"
Noam Shazeer, one of the founders of Character AI, claimed last year that the platform would be "very, very helpful for a lot of people who are lonely or depressed."
Jerry Ruoti, the company's head of safety, told The New York Times that it would add extra safety features for young users, but declined to say how many of the platform's users were under the age of 18.