Facebook has been under fire for a number of dramatic failings. But while lawmakers have focused on Zuckerberg’s empire, another social media time-bomb has been quietly ticking in the background, away from scrutiny: TikTok.
Kids as young as nine were fed Covid-19 and vaccine misinformation within 35 minutes of signing up to TikTok, despite not searching for content or following any accounts, according to a NewsGuard investigation.
A 13-year-old was shown a video claiming the vaccines kill people; a 15-year-old saw a video claiming there was no scientific basis for vaccine roll-outs; and another teenager was shown a video of a known conspiracy theorist asserting the vaccines are fake. None of these videos were fact-checked.
TikTok’s appeal comes from its hyper-sensitive algorithm which feeds users content they can’t resist. By design, the app sends you down rabbit-holes. That can be funny dances and cute puppies – or anti-vax content and conspiracy theories.
Although TikTok says it has introduced a number of measures to fight misinformation, they simply don’t seem to be working.
As vaccinations are rolled out to 12- to 15-year-olds, parents would do well to keep a closer eye on this misinformation free-for-all. Lawmakers should ensure upcoming Online Safety legislation requires platforms like TikTok to be more transparent and to empower users with information about who is feeding them the “news”.