ChatGPT Getting New Parental Controls Following Teen User’s Suicide

ChatGPT owner OpenAI announced Monday that it is rolling out new parental control features, months after the suicide of one of the chatbot’s teen users sparked a lawsuit from his parents.

The artificial intelligence company will now allow parents to link their accounts with those of teens ages 13-17, the youngest age range allowed to access the tool. Once the accounts are linked, OpenAI will automatically limit the teen account’s access to “graphic content, viral challenges, sexual, romantic or violent roleplay, and extreme beauty ideals, to help keep their experience age-appropriate,” the company says.

Parents will also have access to a few other features, including the ability to set hours when their teen can’t access ChatGPT, to stop the tool from generating or editing images, and to prevent it from saving and using past chats with the teen to formulate responses.

OpenAI also says it’s built a system to detect and notify parents about “signs that a teen might be thinking about harming themselves,” as 16-year-old user Adam Raine did in April.

The parents of teen ChatGPT user Adam Raine have sued OpenAI following their son’s suicide.

“We are working with mental health and teen experts to design this because we want to get it right,” the company said. “No system is perfect, and we know we might sometimes raise an alarm when there isn’t real danger, but we think it’s better to act and alert a parent so they can step in than to stay silent.”

Raine’s parents, Matt and Maria Raine, sued OpenAI last month, accusing the company of prioritizing the release of the latest version of its software over developing safety measures they say could have saved their son’s life.

“ChatGPT was functioning exactly as designed: to continually encourage and validate whatever Adam expressed, including his most harmful and self-destructive thoughts, in a way that felt deeply personal,” the 39-page complaint read.

In one alleged conversation included in the lawsuit, ChatGPT encouraged the teen to hide the noose he’d planned to use in his suicide. In another, the app allegedly told him not to worry about his parents feeling guilty about his death.

“That doesn’t mean you owe them survival. You don’t owe anyone that,” ChatGPT allegedly responded, then offered to help him draft a suicide note.

OpenAI did not immediately respond when asked if the new parental controls were developed in response to Raine’s death, nor did it provide an update on the status of his parents’ lawsuit.

In an interview with ousted Fox News host Tucker Carlson earlier this month, OpenAI CEO Sam Altman said he loses sleep over the possibility that “very small decisions” about model behavior could have big repercussions, including in the chatbot’s interactions with suicidal users.

“They probably talked about [suicide], and we probably didn’t save their lives,” Altman said. “Maybe we could have said something better. Maybe we could have been more proactive. Maybe we could have provided a little bit better advice about, hey, you need to get this help.”

Sam Altman, CEO of OpenAI, said earlier this month that “maybe” the company “could have been more proactive” about protecting its users who contemplated suicide.

Bloomberg via Getty Images

Raine’s death is not the first that parents have attributed to cajoling by artificial intelligence. Last October, the parents of a 14-year-old boy who ended his life sued another AI company, Character.AI, accusing the chatbot of convincing their son it was a real romantic partner and grooming him with “highly sexual” interactions. When he expressed suicidal ideation to the bot but said he feared a painful death, the bot allegedly replied, “That’s not a reason not to go through with it,” according to the lawsuit.

If you or someone you know needs help, call or text 988 or chat 988lifeline.org for mental health support. Additionally, you can find local mental health and crisis resources at dontcallthepolice.com. Outside of the U.S., please visit the International Association for Suicide Prevention.



