Shamima Begum’s radicalisation was our failure to address online extremism and protect our kids
Regardless of who Shamima Begum is today, the fact that a group of British teenage girls could be groomed online for sex and violence in a faraway land should be taken as a cautionary tale for us all.
Earlier this month, a 16-year-old from Cornwall was sentenced for terrorist offences. He was 13 when he first downloaded a bomb-making manual and went on to become the UK leader of a banned neo-Nazi terrorist group, the Feuerkrieg Division. Like Begum, his story is an extreme example, but the way online disinformation and radicalisation work means such stories are far from unique.
The rise of conspiracy theory groups like QAnon shows that age is no barrier to falling victim to disinformation, but our children and teenagers are particularly vulnerable.
Teenage brains are different from adult brains. As the US National Institute of Mental Health puts it, their developing brains mean “teens are more likely to engage in risky behaviour without considering the potential results of their decisions.” The teen brain is full of “plasticity”, giving it the ability to “change, adapt and respond” to its environment.
The wonderfully flexible and nimble quality of teens’ brains is being manipulated by algorithms online. We need law and regulation to protect our young minds from this shameful grooming.
At a time when children and teens are forced, more than ever, to live their lives online, what the digital environment looks like and how it affects them will have a huge impact on all our futures.
The online game Roblox has been a mainstay of the pandemic in my house as in many others. But if your child has never come across narratives of suicide or violence among the neon unicorns, they are lucky.
Roblox offers a chance to interact in an isolated world, but just like the world outside your door, it is fraught with danger. Researchers at Tech Against Terrorism have found roleplays of far-right atrocities, like the mosque shootings in Christchurch, New Zealand, readily accessible to any child who is not into Adopt Me or Fashion Famous. The platform that has been a lifeline for kids desperate to play with each other despite pandemic restrictions is also a recruiting ground for far-right terrorism.
The debate around regulating extremist material and disinformation online has been fraught with concerns about free speech rights and claims that regulation would unfairly restrict them. Rightly so. It’s a complicated balance to achieve, but not impossible. But what about the right to freedom of opinion of the person continually bombarded with harmful information, orchestrated by harmful algorithms?
Unlike freedom of expression, the right to form our opinions freely inside our minds is an absolute right in international human rights law.
This includes the right to keep our thoughts and opinions private and free from manipulation, and the right not to be penalised for our opinions, as long as they stay inside our heads. And it relies on access to credible information. Regulating the online ecosystem is not about pulling down every bit of potentially damaging information; it’s about making sure platforms aren’t rigged to show more and more distressing content.
For children and teenagers, the unique plasticity of their minds can be used against them. Insights into the way they are feeling can be used to take them on a manipulative online journey.
Monitoring of their online activities can lead to permanent profiling based on fleeting moments of adolescent intellectual risk-taking.
If we want to protect our children and teenagers, and ultimately our societies, from the threat of online manipulation, we need to look not at individual cases and content, but at the global business model that is built on it. Shamima Begum is a symptom of a much wider problem that we cannot afford to ignore.