
Salesforce CEO Marc Benioff said Tuesday that artificial intelligence "needs some regulation," pointing to several documented suicides linked to the technology.
“Something truly terrifying has happened this year: AI models have become suicide coaches,” Benioff told CNBC’s Sara Eisen at the World Economic Forum’s flagship conference in Davos, Switzerland, on Tuesday.
Benioff's call for regulation echoes similar demands he made about social media at Davos several years ago.
Benioff said in 2018 that social media should be treated like a health issue and that the platforms should be regulated like tobacco, saying, “Social media is addictive and it’s not good for you.”
“Bad things were happening all over the world because social media was completely unregulated, and now it looks like we’re seeing it happen again with artificial intelligence,” he said on Tuesday.
AI regulation in the U.S. has so far lacked clarity, and in the absence of comprehensive guardrails, states have begun enacting their own rules, with California and New York enacting some of the strictest laws.
California Governor Gavin Newsom signed a series of bills in October to address child safety concerns regarding AI and social media. New York Governor Kathy Hochul signed the Responsible AI Safety and Education Act in December, imposing safety and transparency regulations on large-scale AI developers.
President Donald Trump signed an executive order in December aimed at pushing back against what he called “excessive state regulation” and blocking such efforts.
“To win, U.S. AI companies must be free to innovate without burdensome regulations,” the order states.
Benioff was adamant Tuesday that changes to AI regulations are needed.
"The weird thing is, tech companies hate regulation. All but one company hates regulation. They love Section 230, which basically says they're not responsible," Benioff said. "So if this large language model led this child to commit suicide, they're not responsible because of Section 230. Maybe that's something that needs to be reshaped and changed."
Section 230 of the Communications Decency Act protects technology companies from liability for user content. Both Republicans and Democrats have expressed concerns about the law.
“Unfortunately, a lot of families have suffered this year, but I don’t think they had to do that,” Benioff said.
If you are having suicidal thoughts or are in distress, please contact the Suicide & Crisis Lifeline (988) for support and assistance from a trained counselor.
