Social media sites face financial penalties or blocking if they fail to tackle “online harms”, says a government white paper published today.
Under a code of practice set out by the Department for Digital, Culture, Media and Sport (DCMS), an independent watchdog would police the sector, possibly funded by the industry.
The white paper targets the proliferation of terrorist content, child sex abuse, so-called revenge porn, hate crimes, harassment and fake news.
Under the proposal, senior managers would be held responsible for breaches of the code.
DCMS secretary Jeremy Wright said it was time for mandatory rules that tech firms must follow.
“Voluntary actions from industry to tackle online harms have not been applied consistently or gone far enough,” he said.
Home secretary Sajid Javid called on social media firms to “protect the young people they profit from”.
“Despite our repeated calls to action, harmful and illegal content – including child abuse and terrorism – is still too readily available online,” he added.
Whether the government hands powers to an existing regulator or creates a new one, the body would be able to fine companies and name and shame those that do not adhere to the code of practice.
Fines for company executives are also being considered, according to the BBC, which said ministers believe penalties and warning notices for companies will be outlined in a future bill.
DCMS committee chair Damian Collins welcomed the proposal, adding: “There is an urgent need for this new regulatory body to be established as soon as possible.
“It is also important that the regulator has the power to initiate its own investigations into the social media companies when it is clear that they have failed to meet their duty of care to their users.
“This should include the power to discover why effective action was not taken, and who knew what and when within the company about it.
“The regulator cannot rely on self-reporting by the companies. In a case like that of the Christchurch terrorist attack for example, a regulator should have the power to investigate how content of that atrocity was shared and why more was not done to stop it sooner.”