New Mexico Court Fines Meta $375 Million Over Child Safety Violations


A New Mexico court has ordered Meta to pay $375 million after a jury found the company misled users about child safety, exposing minors to sexual content and predators. Meta plans to appeal, while the case highlights growing legal scrutiny over social media’s impact on children.

A court in New Mexico has ordered Meta, the parent company of Facebook, Instagram, and WhatsApp, to pay $375 million (£279 million) after a jury found the company misled users about the safety of its platforms for children. The case focused on how Meta exposed minors to sexually explicit content and potential contact with sexual predators, holding the company accountable for failing to adequately protect young users online.
New Mexico Attorney General Raul Torrez described the verdict as “historic,” marking the first time a state has successfully sued Meta over child safety issues. In response, a Meta spokeswoman said the company disagrees with the verdict and intends to appeal. She emphasized that Meta works to protect users and has been transparent about the challenges of identifying and removing harmful content and bad actors from its platforms.
The jury found that Meta violated New Mexico's Unfair Practices Act by misleading the public about the safety of its platforms for children. During the seven-week trial, jurors reviewed internal Meta documents and heard testimony from former employees who confirmed that the company was aware of child predators using its platforms. Arturo Béjar, a former Meta engineering leader who became a whistleblower, testified that experiments on Instagram showed underage users being served sexualized content. Béjar said his own young daughter had been propositioned for sex by a stranger on Instagram. Prosecutors also presented internal research showing that 16% of Instagram users reported seeing unwanted nudity or sexual activity in a single week.
Meta defended its record, arguing that it has introduced measures over the years to make its platforms safer for minors. In 2024, Instagram launched Teen Accounts to give young users more control over their experience, and last month it introduced a feature that alerts parents if their children search for self-harm content.
The $375 million civil penalty was calculated based on thousands of violations of the Unfair Practices Act, with each violation carrying a maximum penalty of $5,000. Meta also faces a separate trial in Los Angeles, where a young woman claims that deliberately addictive design features on platforms such as Instagram and YouTube caused her to become dependent on them. Thousands of similar lawsuits are progressing through courts across the United States.
New Mexico filed its lawsuit in 2023, alleging that Meta “steered” young users to sexually explicit content, child sexual abuse material, and even solicitations of such content and sex trafficking. The state argued that this occurred through Meta’s recommendation algorithms, which automatically determine the content a user sees. Attorney General Torrez said, “Meta executives knew their products harmed children, disregarded warnings from their own employees, and lied to the public about what they knew. Today the jury joined families, educators, and child safety experts in saying enough is enough.”
The ruling signals a significant moment in ongoing debates over social media regulation, corporate responsibility, and online child safety, highlighting the legal and ethical scrutiny facing major technology companies.