Sex is a big market for the AI industry. ChatGPT won’t be the first to try to profit from it

ChatGPT will be able to have kinkier conversations after OpenAI CEO Sam Altman announced that the artificial intelligence company will soon allow its chatbot to engage in “erotica for verified adults.”

OpenAI won’t be the first to try to profit from sexualized AI. Sexual content was a top draw for AI tools almost as soon as the boom in AI-generated imagery and words erupted in 2022.

But the companies that were early to embrace mature AI also encountered legal and societal minefields, along with harmful abuse, as a growing number of people turned to the technology for companionship or titillation.

Will a sexier ChatGPT be different? After three years of largely banning mature content, Altman said Wednesday that his company is “not the elected moral police of the world” and ready to allow “more user freedom for adults” at the same time as it sets new limits for teens.

“In the same way that society differentiates other appropriate boundaries (R-rated movies, for example) we want to do a similar thing here,” Altman wrote on social media platform X, whose owner, Elon Musk, has also introduced an animated AI character that flirts with paid subscribers.

For now, unlike Musk’s Grok chatbot, paid subscriptions to ChatGPT are mostly pitched for professional use. But letting the chatbot become a friend or romantic partner could be another way for the world’s most valuable startup, which is losing more money than it makes, to turn a profit that could justify its $500 billion valuation.

“They’re not really earning much through subscriptions so having erotic content will bring them quick money,” said Zilan Qian, a fellow at Oxford University’s China Policy Lab who has studied the popularity of dating-based chatbots in the U.S. and China.

There are already about 29 million active users of AI chatbots designed specifically for romantic or sexual bonding, and that’s not counting people who use conventional chatbots in that way, according to research published by Qian earlier this month.

It also doesn’t include users of Character.AI, which is fighting a lawsuit that alleges a chatbot modeled after “Game of Thrones” character Daenerys Targaryen formed a sexually abusive relationship with a 14-year-old boy and pushed him to kill himself. OpenAI is facing a lawsuit from the family of a 16-year-old ChatGPT user who died by suicide in April.

Qian said she worries about the toll on real-world relationships when mainstream chatbots, already prone to sycophancy, are primed for 24-hour availability serving sexually explicit content.
