Meta has been ordered to pay $375m, around £279m, after a US jury found it misled families about the safety of its platforms for children.


The landmark verdict, delivered in New Mexico, is being described as the first time a state has successfully sued the social media giant over child safety. For parents raising children in an always-online world, the outcome adds fresh scrutiny to how platforms like Facebook and Instagram protect young users.

Why has Meta been fined?

A jury found that Meta, which owns Facebook, Instagram and WhatsApp, violated New Mexico's Unfair Practices Act by misleading the public about how safe its platforms were for children.

The court heard claims that Meta’s platforms endangered young users and exposed them to sexually explicit material and contact with sexual predators.

New Mexico Attorney General Raul Torrez called the verdict “historic” and said it marked the first successful state lawsuit against Meta over child safety.

“Meta executives knew their products harmed children, disregarded warnings from their own employees, and lied to the public about what they knew,” Torrez said.

“Today the jury joined families, educators, and child safety experts in saying enough is enough.”

The $375m civil penalty was calculated after jurors concluded there had been thousands of violations of the act, each carrying a maximum penalty of $5,000. At that maximum, the total corresponds to 75,000 violations.

How did Meta respond?

Meta said it disagrees with the decision and plans to appeal.

A spokeswoman for the company said: “We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors and harmful content. We remain confident in our record of protecting teens online.”

The company has previously said it has “made real changes to protect teens online”, including introducing Teen Accounts with built-in protections and tools for parents.

What were the allegations about?

New Mexico first filed the lawsuit in 2022. It claimed Meta “steered” young users towards sexually explicit content, material showing child sexual abuse, and even content involving solicitation and sex trafficking.

According to the state, this happened through recommendation algorithms, the automated systems that decide what users see in their feeds.

Internal research previously reported by the BBC suggests Meta was aware of wider issues linked to engagement-driven algorithms. One internal study said: “Given the disproportionate engagement, our algorithms presume that users like that content and want more of it.”

Whistleblowers have also claimed that, during the race to compete with TikTok, “borderline” harmful content was allowed to circulate more widely. Meta has rejected this, saying: “Any suggestion that we deliberately amplify harmful content for financial gain is wrong.”

The New Mexico ruling comes as Meta faces mounting legal pressure elsewhere in the US.

In Los Angeles, a separate trial is under way in which a young woman claims she became addicted to platforms including Instagram as a child because of how they were designed.

Giving evidence, she told the court: “I stopped engaging with my family because I was spending all my time on social media.”

When asked if her life would have been better had she never used platforms like Instagram, she replied: “Yes.”

Meta chief executive Mark Zuckerberg appeared in court in that case and said: “It's been our consistent policy that they're not allowed and we try to remove them. We're not perfect.”

Thousands of similar lawsuits are currently moving through US courts, brought by families who say their children were harmed by social media use.

What does this mean for parents?

While this case was heard in the US, it will resonate with parents globally who are concerned about what their children are seeing online.

The verdict does not change how platforms operate in the UK overnight. However, it adds to growing international pressure on tech companies to prove they are doing enough to protect children from harmful content, sexual exploitation and inappropriate contact.

For mums navigating screen time, group chats and the pull of social media, the case is another reminder to keep conversations about online safety open at home. Features like parental controls, private accounts and age-appropriate settings can help, but so can regular check-ins about what children are watching, sharing and experiencing online.


Meta has said it will appeal the ruling, meaning the legal battle is far from over. But for now, the $375m penalty sends a clear message that child safety online remains firmly in the spotlight.
