New law helps protect kids from harmful content online - here's what it means for your family
The rollout of the next phase of the Online Safety Act makes social media and search platforms legally responsible for keeping children safer online.

The internet can be a dangerous place for kids, and it’s harder than ever for parents to make sure their children are only exposed to age-appropriate content. But new legislation is changing that.
From 25 July 2025, social media companies and search services are legally required to prevent children from accessing harmful or inappropriate content, such as pornography.
This is the most recent phase of the Online Safety Act 2023, which has also introduced new online offences, such as intimate image abuse, and cracked down on illegal content.
In an article posted to its site, Ofcom, the UK communications regulator responsible for online safety, says: “We want children to be able to enjoy the benefits of being online, while reassuring parents that services like websites, social media platforms, games and apps will be more responsible than ever for making sure online spaces are safer for the children who use them.”
“These are just the latest steps in our online safety work, and there’s more to come. We appreciate that these measures won’t solve all of the problems of the online world, but we’re confident that they will help to bring about a major step forward in online safety for children in the UK.”
How the Act aims to improve children’s online safety
Online platforms that are likely to be accessed by children now have to take steps to protect kids from potentially harmful content and behaviour.
This means children must be prevented from accessing pornography and anything that encourages self-harm, eating disorders, or suicide. They can also only have age-appropriate access to content involving things like misogynistic and hateful material, bullying, serious violence, dangerous stunts, and harmful substances.
Ofcom has introduced a few rules to help platforms enforce these restrictions, including the following:
- Age checks to prevent children from accessing harmful content on the riskiest services, like social media platforms and pornography sites.
- Social media algorithms must not recommend harmful content in children’s feeds.
- Platforms must have easy, accessible ways for parents and children to report problems and harmful content online, and platforms have to respond appropriately.
- Every service must have a named person responsible for children’s safety, and must review its safety measures annually.
And companies that fail to perform these new duties can be fined up to £18 million, or 10% of their global revenue, whichever is greater.
What does this mean for parents and children?
The Act doesn’t seek to set a minimum age for kids to use the internet, or ban them from social media completely — those decisions are still up to parents. But it does aim to restrict what children can access to make their online experiences safer.
Sites that host more mature content, such as pornography sites and forums like Reddit, have started asking visitors to verify their age before they can see this content. In practice, this could mean facial age estimation software, photo ID checks, or a digital identity wallet that proves your age.
Ofcom recommends that kids register on online services with their real age, so that they can only access age-appropriate material. It also recommends parents make sure their children know how to deal with inappropriate or harmful content, by reporting it and not sharing it.
Plus, talking about online safety, and more generally about what your children do online, can help them feel more confident on the internet and fosters trust, so they are more likely to come to you if they ever have a problem.
Read more:
- Fewer than half of parents use parental controls on their kids’ devices – but this isn't necessarily a bad thing.
- Parents ‘must stop trying to be their child’s friend’ over smartphone use, says children’s commissioner
- Should your child talk to a chatbot? Experts warn AI mental health apps may do more harm than good