TikTok will allow experts and policymakers to see how it moderates content on its platform, in the tech firm’s latest effort to allay safety concerns.
The Chinese-owned viral video app said it will open a European Transparency and Accountability Centre, allowing outsiders to see “first-hand” how it runs moderation and safety practices.
The centre will initially open virtually, with a physical site set to open in Ireland next year.
The move marks the latest effort by TikTok, which is owned by Chinese tech giant ByteDance, to ramp up its safety measures amid growing concerns over harmful material and its use of data.
It comes a week after a class action lawsuit was filed against the company over allegations it illegally harvested data belonging to millions of European children.
The case, which is being backed by former children’s commissioner for England Anne Longfield, is seeking billions of pounds in damages. TikTok said the claims “lack merit”, adding it would “vigorously defend” the lawsuit.
In 2019 the app was handed a record $5.7m (£4.2m) fine by US authorities over accusations it illegally collected information from children under 13. The app has also been banned in India, and the UK data watchdog is currently carrying out an investigation.
TikTok today said its new centre, which follows the launch of a similar project in the US last year, would demonstrate how it uses both technology and human moderators to enforce its guidelines.
“With more than 100m users across Europe, we recognise our responsibility to gain the trust of our community and the broader public,” said Cormac Keenan, head of trust and safety at TikTok.
“Our Transparency and Accountability Centre is the next step in our journey to help people better understand the teams, processes, and technology we have to help keep TikTok a place for joy, creativity, and fun.”