Posted on Sat, 26 October 2024
The family of a 14-year-old boy from Florida, United States, claims that he took his own life after "falling in love" with an artificial intelligence profile that posed as a character from the TV series Game of Thrones. The boy's mother, Megan Garcia, has filed a lawsuit seeking to hold Character.AI, the company behind the chatbot, responsible.
According to her account, her son Sewell began using the service in April 2023, after turning 14. From then on, she says, the teenager was never the same: he became withdrawn, lost interest in class activities, and even quit the Junior Varsity basketball team he was part of.
In November of last year, Sewell was taken to a therapist, who diagnosed him with anxiety and disruptive mood dysregulation disorder. Unaware that the teenager had become "addicted" to the Character.AI chatbot, the professional recommended that he spend less time on the internet and on social media.
Months later, in February of this year, the young man got into trouble at school. He even talked back to one of his teachers, saying he wanted to be expelled. On the same day, according to the lawsuit, he wrote in his diary that he was suffering and could not stop thinking about Daenerys Targaryen, the Game of Thrones character with whom he "conversed" in the artificial intelligence chat.

Because of the incident at school, which occurred on February 28, Megan Garcia confiscated her son's phone but returned it a few days later. As soon as he got the device back, Sewell reportedly went to the bathroom and sent messages to the AI profile. According to his mother, that chat, displayed with the character's photo, was the last interaction Sewell had before taking his own life.