Social media platforms must do more to protect children from harmful material on their sites, the children’s commissioner has said.
Anne Longfield today penned an open letter to social media giants urging them to take legal responsibility for the safety of children on their sites.
The letter calls for the internet firms to back a statutory duty of care and to finance a digital ombudsman, which would act as an independent regulator for young users.
“The recent tragic cases of young people who had accessed and drawn from sites that post deeply troubling content around suicide and self-harm, and who in the end took their own lives, should be a moment of reflection,” wrote Longfield.
“I would appeal to you to accept there are problems and to commit to tackling them – or admit publicly that you are unable to.”
The move comes amid growing pressure on Facebook, which also owns photo-sharing app Instagram, to better police the content on its site to protect children.
Facebook’s head of communications Sir Nick Clegg has pledged to improve the company’s strategy for protecting children on its social media platforms.
The former deputy prime minister told the BBC the firm will do “whatever it takes” to make young people safer on Facebook and Instagram.
MPs have called for greater oversight of the tech giant after it emerged that 14-year-old Molly Russell took her own life in 2017 after viewing images relating to self-harm on social media.
Ian Russell, Molly’s father, has said he believes Instagram is partly to blame for his daughter’s death. Health secretary Matt Hancock has warned the sites could face a ban if they fail to remove harmful content.
Andy Burrows, associate head of child safety online at the NSPCC, said: “It is good to see that the children’s commissioner is backing the NSPCC’s proposal for a statutory duty of care.
“Social networks have repeatedly shown they are incapable of regulating themselves and that they need to be forced to protect children.”