The coronavirus pandemic has brought about global social distancing and home-working which, in turn, has meant an uptick in the amount of professional and personal data being transferred online and via apps.
Even before the crisis, privacy was dead. Hopefully, the spike in data transfer will push governments, educators and big technology companies to work together and, finally, fix privacy.
Let’s get one thing straight: privacy is finished. And this has nothing to do with the current pandemic: we have not had real ‘ownership’ of our data for some time.
We share our fingerprints and our facial geometry, our words and deeds and desires. Everything we do is consumed, processed and collated to create a complete understanding of who we are.
As human rights lawyer Susie Alegre pointed out: anyone with sufficient resources has the ability to interfere with what you think. Alegre makes her point in relation to a number of notable referendums and elections: in the face of behavioural micro-targeting, can these results truly be considered “the clear will of the people”?
Those who control the medium also control the choices people make. Fundamentally, they do so by giving users a nudge here or there, and by offering or withholding a choice. Because human beings make decisions based on the information they receive, these gatekeepers are effectively calling the shots on how you form your opinions.
At heart, the problem is that there is no ethical framework (or body of law) that guides or constrains this. So: what can be done to ensure that your thoughts and opinions are your own? We need a three-point solution, driven by educators, law-makers and big tech.
Firstly, people need to be educated. An understanding of how individuals can be manipulated and influenced is of course vital. But, perhaps even more importantly, people need to know why they should care. At present, for every person who worries about their data, there are thousands who are indifferent. The same applies to developers. How can we expect developers to care for the privacy of others if they never learnt to care for their own? Teaching that concern is how we instil an ethical baseline to work from.
Educators also need to get involved: ethics needs to become part of the courses, classes and systems that produce designers and developers. Ethics only occasionally features as a module in Computer Science degrees, and it isn’t something that the legions of self-taught web developers are encouraged to put much thought into.
Secondly, the platform providers and the developers of apps and services need to step up to the plate. In Google’s case, they are. Thanks to the prevalence of the Android operating system, Google is the dominant app platform globally, and it is already thinking hard about this: contrary to popular opinion, the company not only wants but needs a healthy relationship with its users, and it recognises that, in that relationship, it will need to do all the work. Organisations like Google will have a central role in monitoring, enforcing and driving new ethical standards, all the while balancing these with commercial interests. That balance must be found, as it would be impractical and naive to sideline the commercial aspects of ‘big tech development’.
Finally, regulation and litigation have a role to play. Governments have a responsibility to their populations, and need to engage with major companies and agree a regulatory framework that sets legal boundaries on how the software we use is designed and built. A principle of “minimal” data collection, whereby it would be illegal for a company to collect more data than it needs? Forced disclosure of source code, and of records relating to design and development decisions?
These ideas, and others, are a place to start. Coupled with existing human rights legislation that already prohibits interference in what we think, actual litigation against the most egregious offenders will start to establish some hard legal boundaries.
Educators, legislators and big tech need to coordinate as one. At present, the almost unmourned death of privacy is just a milestone on the road to routine manipulation of individuals and populations.