Ofcom will be given the power to fine companies up to £18m or 10 per cent of annual global turnover for failing to comply with rules published today in a new draft online safety bill.
The government has published a new draft bill that will impose more stringent rules on social media companies, in the hope of tackling some of the worst online abuse and keeping younger people safe online.
Under the draft proposals, which will be scrutinised by a joint committee of MPs before a final version is formally introduced to parliament, all companies in the bill’s scope will have a duty of care towards their users so that what is unacceptable offline will also be unacceptable online.
Under rules outlined in the bill, social media sites, websites, apps and other services hosting user-generated content, or that allow people to talk to others online, must remove and limit the spread of illegal and harmful content such as child sexual abuse, terrorist material and suicide content.
Social media companies, websites and apps will also have a greater responsibility to police their content more broadly.
They will have a duty to consider the risks their sites may pose to the youngest and most vulnerable people, and take robust action to tackle illegal abuse, including action against hate crimes, harassment and threats directed at individuals.
The largest social media sites will also need to act on content that is lawful but still harmful, such as abuse that falls below the threshold of a criminal offence, encouragement of self-harm and mis/disinformation.
Ofcom will be given the power to fine companies that fail in the new duty of care up to £18m or 10 per cent of annual global turnover, whichever is higher, and will have the power to block access to sites.
Home Secretary Priti Patel said: “This new legislation will force tech companies to report online child abuse on their platforms, giving our law enforcement agencies the evidence they need to bring these offenders to justice.
“Ruthless criminals who defraud millions of people and sick individuals who exploit the most vulnerable in our society cannot be allowed to operate unimpeded, and we are unapologetic in going after them.
“It’s time for tech companies to be held to account and to protect the British people from harm. If they fail to do so, they will face penalties.”
Also included in the bill are measures to tackle user-generated fraud.
Online companies will have to take responsibility for tackling fraudulent user-generated content, such as posts on social media, including romance scams and fake investment opportunities posted by users on Facebook groups or sent via Snapchat.
However, fraud via advertising, emails or cloned websites will not be in scope because the bill focuses on harm committed through user-generated content.
Dame Melanie Dawes, Ofcom chief executive, added: “Today’s bill takes us a step closer to a world where the benefits of being online, for children and adults, are no longer undermined by harmful content.
“We’ll support Parliament’s scrutiny of the draft bill, and soon say more about how we think this new regime could work in practice – including the approach we’ll take to secure greater accountability from tech platforms.”