The way information is being collected and sold on the internet is creating stark new power imbalances in our societies.
The Cambridge Analytica revelations remind us how easy it now is for unaccountable actors to capture personal data, and abuse that power in ways that could have a profound influence on the outcomes of democratic processes.
So amid calls to tighten privacy settings, or even #DeleteFacebook entirely, what happens next? How can this power be meaningfully reclaimed so that people have control? And how should we encourage more responsible innovation with personal data?
Here are four possible ways forward.
First, we could look to the growing list of tools that help people manage, and benefit more directly from, their digital footprints. Apps like Hub of All Things, Digi.me and Meeco let people collect and blend their data from different online applications, giving new, layered insights into personal habits and behaviours. People can set the terms on which companies access their data (say, in exchange for a small fee), making individuals more active and powerful stakeholders in the data marketplace.
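The idea of owner-set access terms can be sketched in a few lines. This is a hypothetical illustration, not the API of any of the apps named above; the class, company names and fee values are invented for the example.

```python
# Illustrative sketch: a personal data store where the owner, not the
# platform, decides which company may read which attribute and at what fee.

class PersonalDataStore:
    def __init__(self):
        self._data = {}    # attribute name -> value
        self._terms = {}   # company -> (allowed attribute names, fee)

    def add(self, name, value):
        self._data[name] = value

    def grant(self, company, attributes, fee):
        """The owner sets terms: which attributes a company may read, at what fee."""
        self._terms[company] = (set(attributes), fee)

    def request(self, company, attribute, payment):
        """A company's read succeeds only if it matches the owner's terms."""
        allowed, fee = self._terms.get(company, (set(), None))
        if fee is not None and attribute in allowed and payment >= fee:
            return self._data[attribute]
        raise PermissionError("access denied by owner's terms")

store = PersonalDataStore()
store.add("step_count", 8042)
store.grant("fitness-insights-ltd", ["step_count"], fee=0.10)
print(store.request("fitness-insights-ltd", "step_count", payment=0.10))  # 8042
```

The key design point is that the default is denial: a company with no grant, or one offering less than the owner's price, gets nothing.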
Second, we can scale this model towards more democratic forms of data governance. Individuals by themselves don’t have much bargaining power over how their data is used (which is why most people just click “I agree” when faced with draconian T&Cs). Data needs to be aggregated and processed in large volumes to yield significant rewards, and new platforms are enabling groups of people to leverage their data’s collective value.
Examples like Midata.coop and Salus.coop create "commons" for people's health data, embedding transparency and participation in decisions about how sensitive information is used. Patients gain collective influence by pooling data, creating a valuable resource which pharmaceutical companies can access, but only under specific conditions (such as openness about the results of medical trials).
Third, new technologies are putting power directly into the hands of data owners. Researchers in the Netherlands are working on a system called I Reveal My Attributes that lets people use online services in a way that’s “authenticated but anonymous”. Users collect simple discrete “attributes” about themselves in an app (like “I am over 18”), which can be used to verify them without exchanging more information than is necessary for that transaction.
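The "authenticated but anonymous" idea can be illustrated with a toy example. The real IRMA system uses attribute-based credentials with public-key cryptography; the HMAC below is a simplification standing in for the issuer's signature, and the issuer name and key are invented for the sketch.

```python
# Toy illustration of selective disclosure: a trusted issuer signs a bare
# claim ("over-18"), and a service later verifies that claim without ever
# seeing a name or date of birth. (IRMA itself uses attribute-based
# credentials, not the shared-key HMAC used here for brevity.)
import hashlib
import hmac

ISSUER_KEY = b"issuer-secret"  # held by a trusted issuer, e.g. a bank

def issue_attribute(claim: str) -> bytes:
    """The issuer checks the claim against its records, then signs only
    the claim text itself."""
    return hmac.new(ISSUER_KEY, claim.encode(), hashlib.sha256).digest()

def verify_attribute(claim: str, signature: bytes) -> bool:
    """A service learns nothing beyond the attribute it needs."""
    expected = hmac.new(ISSUER_KEY, claim.encode(), hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

sig = issue_attribute("over-18")
print(verify_attribute("over-18", sig))  # True
print(verify_attribute("over-21", sig))  # False: the credential proves
                                         # only what it says, nothing more
```

Note the asymmetry this creates: the verifying website holds no personal data worth breaching, because it never received any.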
With the UK government soon asking people to prove their age using digital IDs on porn websites, this technology could be one way to avoid another Ashley Madison-style data breach.
Other initiatives like Blockstack, Sovrin, and the DECODE project (on which Nesta is a partner) are building new foundations for data sharing on the internet, drawing inspiration from decentralised technologies like bitcoin. They enable a system whereby all interactions involving people’s data are fully auditable on a public ledger (though raw data itself remains hidden).
The aim is to enforce higher transparency and accountability over who has access to data and for what purpose.
DECODE is testing how this could allow people to express that their data be used for specific social purposes, such as to inform local policymaking.
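The ledger idea can be sketched as a hash chain: every access to someone's data is logged publicly, but only as a digest, so the log is auditable while the raw data stays hidden. This is an illustration of the general principle, not the actual DECODE, Blockstack or Sovrin design; the entry fields and party names are invented.

```python
# Illustrative sketch: each access to a person's data is appended to a
# public, hash-chained log. Anyone can audit who accessed data and for
# what purpose; the data itself appears only as a SHA-256 digest.
import hashlib
import json

ledger = []  # public, append-only

def log_access(who: str, purpose: str, data: bytes) -> dict:
    prev = ledger[-1]["hash"] if ledger else "0" * 64
    entry = {
        "who": who,
        "purpose": purpose,
        "data_digest": hashlib.sha256(data).hexdigest(),  # not the raw data
        "prev": prev,
    }
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()).hexdigest()
    ledger.append(entry)
    return entry

def chain_is_valid() -> bool:
    """Re-verify the whole chain without ever seeing the raw data."""
    prev = "0" * 64
    for e in ledger:
        body = {k: e[k] for k in ("who", "purpose", "data_digest", "prev")}
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if e["prev"] != prev or e["hash"] != digest:
            return False
        prev = e["hash"]
    return True

log_access("city-council", "local policymaking", b"<sensor readings>")
log_access("research-lab", "air quality study", b"<sensor readings>")
print(chain_is_valid())  # True
```

Because each entry's hash covers the previous entry's hash, an auditor can detect any retroactive tampering with the access history.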
The fourth opportunity is in the shifting regulatory landscape. The upcoming General Data Protection Regulation will restructure market incentives by threatening hefty fines for non-compliance with strict new rules. For some companies, the benefits of personalised marketing will start to be outweighed by the risks of getting it wrong.
The regulation also gives people a new right to "data portability". This will make it easier for users to seamlessly move their data between different services, spurring new competition and consumer choice.
None of the above should be seen as a silver bullet. One of the biggest barriers to adoption will be public attitudes.
That said, if the latest data scandal really is just the tip of the iceberg, it could be the first of many high-profile stories that drive fresh demand for more ethical alternatives.