MPs slam Online Safety Act as ‘not up to scratch’
The UK’s Online Safety Act (OSA) is failing to protect the public from the spread of harmful misinformation and must be urgently overhauled, MPs have warned in a damning new report.
In findings published Tuesday, Parliament’s Science, Innovation and Technology Committee (SITC) said the OSA – which began to take effect earlier this year – contains “major holes” and is ill-equipped to deal with legal but harmful content amplified by social media algorithms.
While welcoming the legislation as a first step, the committee said the law has quickly become outdated in the face of fast-evolving technologies and business models.
MPs warned that the UK risks a repeat of the 2024 riots, which were partly fuelled by viral disinformation, unless stronger safeguards are introduced.
“Social media can undoubtedly be a force for good, but it has a dark side,” said committee chair Dame Chi Onwurah MP. “It’s clear the Online Safety Act just isn’t up to scratch.”
MPs argue that social platforms are not neutral hosts but actively boost content through engagement-driven algorithms, often prioritising reach over accuracy.
The report urges the government to impose new duties on tech firms to de-prioritise content flagged by independent fact-checkers and to open their recommendation systems to outside scrutiny.
Regulation fails to keep pace with AI
The report also highlights a growing regulatory gap in digital advertising, warning that false or harmful content is still being monetised with little oversight.
MPs said social media business models, which reward virality, have created an ecosystem where misinformation spreads rapidly – and profitably.
At the same time, generative AI is creating new risks that the OSA does not adequately address. The committee called for urgent legislation to bring AI-generated content under regulatory control, and criticised the confusion between regulators and the government over who is responsible for online harms stemming from emerging technologies.
The government has described the OSA as the foundation of a safer internet. But experts have warned that compliance could hit smaller UK-based platforms hardest, with some potentially exiting the market altogether.
Ben Packer, partner at Linklaters, said that new obligations could prove “unsustainable” for firms without deep pockets or existing moderation infrastructure.
The committee also pointed to Ofcom research showing that many users struggle to tell fact from fiction online. In May, the regulator said an “information overload”, fuelled by AI content and confirmation bias, was leaving users vulnerable to scams, conspiracies and misinformation.
“Social media companies actively curate what you see online, and they must be held accountable,” Onwurah said. “To create a stronger online safety regime, we urge the government to go further – from regulating AI to setting clearer standards for digital platforms.”
The SITC said it would continue to scrutinise the government’s progress and warned that without stronger action, the UK public, particularly young people, would remain exposed to unchecked online harm.