Rules to keep children safe online finalised by Ofcom

Ofcom has issued the final safety codes requiring online platforms to implement over 40 protective measures for users under the age of 18 by July.
These include tools to verify user age, block underage access to adult content, and remove harmful material linked to self-harm, suicide, and eating disorders.
Platforms that fail to comply could face fines of up to £18m or 10 per cent of global revenue, or even be blocked from operating in the UK entirely.
The regulator has urged the use of technologies like facial age estimation, credit card checks, and digital IDs to enforce age restrictions.
Platforms must also adjust their algorithms to prevent harmful content spirals – a move aimed at reducing young users’ exposure to dangerous material.
Dame Melanie Dawes, Ofcom’s chief executive, called the move a “reset” in how children experience the internet.
Tech secretary Peter Kyle called the news a “watershed moment” after years of “poisonous environments” online, adding: “Ofcom has been tasked with bringing about a safer generation of children online, and if companies fail to act they will face enforcement.”
Meta responds as pressure mounts
Meta, which owns Facebook and Instagram, is under increased pressure to comply with the Online Safety Act.
The tech behemoth said it is “reviewing the proposals carefully”, and is already investing heavily in safety tools, including AI content moderation and parental controls.
In a statement, a Meta spokesperson said: “We share Ofcom’s goal of making the internet safer for young people.
“We’ve introduced more than 30 tools to support teens and families and are working to meet evolving expectations”.
Meta has historically faced criticism over the effectiveness of its content moderation. Research by the Molly Rose Foundation found that between September and April, its platforms identified just one per cent of posts related to suicide and self-harm.
This places Meta well behind competitors such as Pinterest and TikTok, which detected over 95 per cent of similar content.
While Meta has expressed its intention to align with the new regulations, the company has not provided exact details on how its new tools will enhance its content moderation systems to meet Ofcom’s standards.
As the July deadline approaches, Meta and other big tech firms will need to demonstrate tangible improvements in safeguarding children online to avoid penalties.
Next steps
Baroness Beeban Kidron, architect of the Age Appropriate Design Code, welcomed the rules but urged “robust enforcement” to ensure companies meet their responsibilities.
Industry experts also warn that the Government must stay ahead of technology rather than lag behind it.
Iona Silverman, IP and media partner at Freeths, said that “the government needs to think bigger: this is a problem that requires a cultural shift, and also requires legislation to be one step ahead of, rather than behind, technology”.
Online services must now assess their risks and submit transparency reports to Ofcom.
The regulator says enforcement will begin after the July deadline and will target non-compliant firms, whilst it also consults on codes for illegal content and fraud.
“In 2025, the industry’s top priority should be creating safer, more supportive online environments for children. The regulatory framework and the technology to back it up already exist, so platforms have no excuse not to take immediate action”, said Lina Ghazal, head of regulatory and public affairs at Verifymy.