
Meta is back in court, but this time there’s only one defendant.
Opening arguments in the high-profile case brought by New Mexico begin Monday. The state’s attorney general, Raul Torrez, alleges that the company failed to protect child users on apps like Facebook and Instagram from online predators.
The lawsuit, originally filed in 2023, alleges that Meta “facilitated human trafficking by directing and connecting users, including children, to sexually explicit, exploitative, and child sexual abuse content” in the state.
“What we’re really alleging is that Meta has developed a dangerous product, a product that not only targets children, but allows them to exploit children in virtual spaces and in the real world,” Torrez said Monday on CNBC’s “Squawk Box.”
The case is one of several this year that could have a major impact on Meta and the broader social media industry. Experts have compared it to the lawsuits brought against Big Tobacco in the 1990s, which centered on tobacco companies’ efforts to mislead the public about the harm their products caused.
In January, a separate trial got underway in Los Angeles over allegations that Meta, YouTube, TikTok, and Snap failed to inform the public about the safety of their social media and video streaming apps, despite knowing that the apps’ design and certain features were harmful to the mental health of young users. TikTok and Snap settled with the plaintiffs before the trial began.
Opening statements in the Los Angeles trial were scheduled to begin last week but were postponed due to the lead attorney’s unexpected illness. A Meta spokeswoman said the 18-member jury was seated Friday afternoon and opening arguments will begin Monday. Instagram chief Adam Mosseri is scheduled to testify Wednesday, followed by Meta CEO Mark Zuckerberg a week later.
Many of New Mexico’s allegations against Meta stem from covert operations conducted by the attorney general’s office, including the creation of fake social media profiles modeled after 13-year-old girls. Torrez previously told CNBC that the dummy social media profiles were “simply flooded with images and targeted solicitations, which I found frankly shocking.”
Meta has denied the allegations and said in various statements to the media that the company is “committed to demonstrating our long-standing commitment to supporting young people.”
Torrez told CNBC on Monday that while Meta could face hefty fines, what he fundamentally wants is for the company to change its practices.
“We need real age verification. We need product design changes to avoid linking children to predators on the platform. We need full disclosure in terms of making users aware of the potential harm and danger,” he said.
Although social media companies argue that the content shared on their apps is protected under Section 230 of the Communications Decency Act, the overarching theme of the various lawsuits is that tech companies are allegedly putting young users at risk through the design and features of their apps.
Another trial is scheduled to begin later this year in the Northern District of California. The federal lawsuit involves Meta, TikTok, YouTube, and Snap and centers on allegations that the companies developed flawed apps that resulted in unhealthy and addictive behaviors among teens and children.

