The UK’s communications regulator has given online platforms in sectors such as social media, search and gaming until 24 July to comply with new rules aimed at improving child safety online.
Ofcom’s “transformational new protections”, finalised today, are designed to prevent children from encountering harmful content relating to topics such as self-harm, eating disorders and pornography.
They also aim to shield young users from violent, misogynistic and hateful material.
“These changes are a reset for children online,” said Melanie Dawes, Ofcom’s chief executive.
“They will mean safer social media feeds with less harmful and dangerous content, protection from being contacted by strangers, and effective age checks on adult content.”
Measures include requiring social media firms to configure their algorithms to filter out harmful content. Platforms hosting content unsuitable for children must implement robust age verification – and if they cannot, they must ensure their services are appropriate for all ages.
Sites are also expected to have systems in place to review, identify and address harmful content once it comes to their attention. They must also offer better support to help children avoid exposure to such material.
In addition, services must appoint a named individual accountable for children’s safety, and a senior body must conduct an annual review of how risks to children are managed.
“Ofcom has been tasked with bringing about a safer generation of children online, and if companies fail to act, they will face enforcement,” Dawes added.
If platforms fail to comply by the deadline, Ofcom has the authority to impose fines and, in serious cases, seek a court order to restrict a site’s or app’s availability in the UK.
Commenting on the announcement, Lina Ghazal, head of regulatory and public affairs at Verifymy, said: “This will feel like a rip-off-the-plaster moment for thousands of platforms… but these changes have been a long time coming.” She added that a “huge spring clean” is now underway, and by summer, children in the UK should have “safer and more age-appropriate experiences online.”
Ghazal also stressed that protecting children was “not a one-time fix,” but an ongoing effort as platforms must “assess risks, monitor safety measures, and adapt to emerging threats.”