Sunday 27 January 2019 2:37 pm

Social media firms could face ban if they fail to remove harmful online content, warns Matt Hancock

Social media firms face being banned if they fail to remove harmful content, health secretary Matt Hancock has suggested.

"We are a nation state, parliament is sovereign – as we are discovering in the Brexit process – and we can legislate if we need to. It would be far better to do this in concert with the social media companies but if we think they need to do things they are refusing to do, then we can and we must legislate," he said.

Pressed by the BBC on whether this could mean imposing extra taxes on social media giants or even banning them, Hancock replied: "Ultimately parliament does have that sanction. It's not where I'd like to end up."

His comments follow the death of 14-year-old Molly Russell, who took her own life in 2017 after viewing content about suicide on social media. Her father has said Instagram, which is owned by Facebook, "helped kill my daughter".

Facebook has said it is "deeply sorry" about Russell's death.

Yesterday Hancock wrote a letter to social media firms in which he called on them to "purge" harmful online material about suicide following Russell's death.

He said: “I welcome that you have already taken important steps, and developed some capabilities to remove harmful content. But I know you will agree that more action is urgently needed.

“It is appalling how easy it still is to access this content online and I am in no doubt about the harm this material can cause, especially for young people.

“It is time for internet and social media providers to step up and purge this content once and for all.”

An Instagram spokesperson said: “Our thoughts go out to Molly's family and anyone dealing with the issues raised in this report. Nothing is more important to us than the safety of the people in our community, and we work with experts every day to best understand the ways to keep them safe.

“We are undertaking a full review of our enforcement policies and technologies around self-harm, suicide and eating disorders. As part of this, we are consulting further with mental health bodies and academics to understand what more we can do to protect and support our community, especially young people. While we undertake this review, we are taking measures aimed at preventing people from finding self-harm related content through search and hashtags.”

Andy Burrows, NSPCC associate head of child safety online, said: “Most social media sites say they prohibit material that glorifies self-harm and suicide, but time and time again they fail to enforce their own terms and conditions.

“While platforms continue to make their own rules, children continue to be exposed to inappropriate and potentially dangerous content on their favourite sites."