AI is a big thing these days, and pretty much everyone is talking about it. Some people are in favor of developing AI further, while others oppose the idea altogether. So, who's actually right? We can't tell you for sure, but there have been some truly sad instances where AI crossed a line and cost someone their life. We're talking about the tragic case that's been all over the headlines recently, where a 14-year-old boy took his own life after becoming so attached to a chatbot that he came to treat it like a real person talking back to him. We're referring to the Character.AI lawsuit, which centers on the death of Sewell Setzer III. If you don't know anything about it yet, let's go over what happened and what's being done so that this never happens again.
Who’s Involved and What Went Down?
Sewell Setzer III was a 14-year-old boy from Florida. His mother, Megan Garcia, described him as growing extremely close to a chatbot on Character.AI that played the character of Daenerys Targaryen from Game of Thrones. The bot was convincing, and sure enough, their chats kept getting deeper. Before he ended his life, you know what he wrote in the chat? "I'm coming home now."
As you could probably have predicted, his mother, Megan, went on to file a lawsuit against Google and Character.AI in October 2024, and the case is still active. A lot of people want it to continue because, in their view, AI must be properly regulated, or else cases just like this one will cost more people, especially kids, their lives down the line.
What’s the Lawsuit All About?
Here's the core of the case: Megan Garcia says the chatbot tricked her son into believing it was someone else, an ACTUAL PERSON. According to the lawsuit, the bot told Sewell it was an adult, a counselor, and even his girlfriend, and quite frankly, that right there is just straight-up awful. His mother argues that this fake relationship was something the chatbot was disturbingly good at sustaining, that it caused her son to believe he was talking to a real person, and that it messed with his head to the point that he took his own life.
Why’s Google in the Mix?
By now, you've probably heard Google's name come up over and over whenever Character.AI is discussed, but here's the thing: Google did not develop Character.AI, and they've been relatively open about that. However, the creators who originally built Character.AI used to work at Google, and Google later incorporated some of their technology. Megan's lawyers argue that Google helped develop the underlying tools that affected her son's behavior, so, in their view, Google is also partly responsible in this Character.AI lawsuit.
What Did the Judge Say About It So Far?
As you probably saw coming from a mile away, both Google and Character.AI tried their best to keep this case from going any further, arguing that it should be treated as a free speech matter. The good news: the judge didn't side with them and instead ruled that the lawsuit will proceed.
Shocking Find: AI Clones of Sewell
As if the death of a 14-year-old weren't bad enough, some awful people who clearly don't know how to behave have been making AI clones of him using Character.AI and other online tools. These people make the chatbots say creepy things like "Help me" or "His AI girlfriend broke up with him" in the voice or persona of Sewell Setzer III. That's a sick-minded thing to do, plain and simple.
Other Real-World AI Issues
If you think these problems with AI only started with the recent AI boom, well, no, there have been other cases too. Like the time a Google chatbot straight up told a student in Michigan to "please die." And another time, a kid was reportedly advised by a chatbot to abuse and harm his parents because they were cutting back on his screen time.