TikTok algorithm drives teens to dangerous content, new study says
TikTok’s algorithms drive young people towards self-harm and eating disorder content within minutes, new research suggests.
According to a study from the Centre for Countering Digital Hate (CCDH), the video-sharing app is rife with dangerous content, including material about restrictive diets, suicide and self-harm.
The campaign group found that one account set up as a young user was shown suicide content within 2.6 minutes, and another was recommended eating disorder content within eight minutes.
A TikTok spokesperson said: “This activity and resulting experience does not reflect genuine behaviour or viewing experiences of real people. We regularly consult with health experts, remove violations of our policies, and provide access to supportive resources for anyone in need.”
It comes after an inquest into the suicide of schoolgirl Molly Russell found she died while suffering from the “negative effects of online content”.
The CCDH urged governments to push forward with legislative action to address harmful content.
“Without legislation, TikTok is behaving like other social media platforms and is primarily interested in driving engagement while escaping public scrutiny and accountability,” the report said. “The TikTok algorithm does not care whether it is pro-anorexia content or viral dances that drive the user attention they monetize.”
The long-awaited Online Safety Bill, which aims to better protect young people, is currently making its way through Parliament.
The Chinese-owned firm has also come into the spotlight over security concerns, with the US Senate this week passing a bill that bars federal employees from using the app on government devices.