Facebook vs. democracy: Let’s get serious about disinformation
On Monday, Facebook announced new measures to help combat the spread of disinformation ahead of the looming UK General Election.
Plans include working with fact checkers, tracking the origin of political ads, and doing more to take down content that breaks Facebook’s rules.
While positive, these lukewarm efforts do little to reassure us that Facebook is taking the issue seriously enough.
They come, after all, in the wake of a European Court of Justice ruling this month allowing EU member states to order the removal of defamatory content from the site. Facebook denounced the ruling, claiming that it would undermine free speech. But given the company’s track record, how seriously should we take its professed support for democratic values?
Some 18 months ago, Mark Zuckerberg promised that Facebook would do more to combat politically motivated disinformation campaigns. The pledge came after millions of users had their data harvested by Cambridge Analytica, with the intent of influencing their voting choices.
Yet according to a recent report from Oxford University, no “decisive action” has been taken since, while governments around the world continue to run disinformation campaigns in pursuit of their political goals. And it’s getting worse. The report found that the number of countries targeted by disinformation campaigns had doubled since 2018, with the majority of these campaigns still conducted via Facebook.
While government attempts to influence the public are certainly not new, social media can play a catastrophic role in supercharging this game – as any neuroscientist will tell you.
The instant sharing of news is not driven by rational calculation or critical thinking. We get excited by certain articles, most of which reaffirm our existing notions of the world. The more “likes” we receive, the more dopamine hits we get, ultimately feeding our addiction to external validation.
Modern neuro-imaging tools – which allow unprecedented visual access into the brain – show that the prefrontal cortex (home of the brain’s logical processing) takes a back seat when we read the news. A tweet, much like a rumour, is far more likely to be shared if it connects with the emotional, social side of our brain.
Facts become subconsciously irrelevant in the spread of news. We’ve seen this play out time and time again, from outdated photos circulating online at moments of crisis to viral celebrity death hoaxes.
While there are undoubtedly social, and even political, benefits to easily accessible information, issues arise when we consider the sheer volume of this content. We have not yet adjusted to this age of “information overload”, which exposes our inability to filter fiction from fact.
Neurochemically, we are hardwired to engage with social media. But with tech firms failing to take the necessary steps, it falls to governments to hold them to account.
The recent EU court case showed that countries can force social media platforms to take down posts which break the law. And there are other examples of how government action on misinformation can work. The Czech Republic, Germany and France have begun taking major steps, by actively monitoring social media sites for disinformation campaigns and requiring online platforms to filter fake news.
By banning false stories, governments can safeguard elections and prevent public health scandals (like the debunked but damaging anti-vaccination movement) and conspiracies that threaten to tear society apart.
We cannot rewire our brains to resist the pull of misinformation. Combatting its spread is our only hope. This means that if tech companies are unwilling to be more open and engage with the public sector, governments will have to force them.
It is in the interest of these companies to act – and act more effectively than Facebook’s efforts this week. If they remain unwilling, the voices calling for the break-up of the tech giants will only get louder.