Let’s be honest, Facebook has got away with it.
Since 2014, it has had its hands in the cookie jar, taking our personal data and passing it on to companies like Cambridge Analytica, which could then do whatever they wanted with it. There were no checks, no balances, and no regulation of Facebook’s corporate actions.
Then Facebook got caught, and admitted it. It would be generous to describe Mark Zuckerberg’s public appearances in response as in any way adequate.
This week, the UK Information Commissioner’s Office (ICO) announced that it was intending to impose a fine on Facebook of £500,000: the maximum allowed.
It sounds like a big sum to you or me, but let’s put it into perspective, shall we? In 2017, the EU fined Facebook £95m for providing “incorrect or misleading” information during its purchase of messaging service WhatsApp in 2014. Just saying.
Memories are short these days, and I am sure that those who log on to the social media platform daily to check their news feed and share family photographs have resigned themselves to the reality that they have few alternatives for staying connected online.
But data protection is just the tip of the iceberg when it comes to how social media giants operate. The separate but linked problem of fake news reveals how little these companies care about corporate responsibility, and the impact they are having on our society.
The fight against fake news is constant. Fake accounts, bots and hackers are using Facebook and other social media sites as vehicles to influence how we think, what we say, and how we vote.
The outrage over the Cambridge Analytica scandal, in which user data was passed on to third parties without permission for use in a political campaign, was heightened by the revelations that these sites have become a hotbed for fake news and misleading information.
Let’s take an example that has nothing to do with Cambridge Analytica, or Facebook for that matter.
Since early June 2018, several UK-based human rights accounts have been spoofed to feed disinformation to the British public. Unknown influence professionals have launched a coordinated effort to establish five duplicate human rights Twitter feeds and at least one website. They have mirrored and modified legitimate accounts to obfuscate the truth.
Why? To spread disinformation. And what has Twitter done about it? Nothing.
Unless the tech giants are forced to monitor and shut down these types of accounts, we will continue to be fed disinformation, and our online world will go to hell in a handcart.
It’s time to change the behaviour of the online giants by ensuring that corporate responsibility becomes a genuine part of how they think and act. Those Facebook adverts that have popped up all over London, saying that the site is now totally committed to removing “fake news”, won’t cut it.
Quite simply, we must start to see corporate responsibility as an institutional value – and the social media giants are blowing it.
Integrity is something you should always have in business, not something you bolt on when you are caught doing something bad. This Facebook fine should be a warning that the tech firms need to get their house in order. But don’t hold your breath.