Since the 1950s, researchers have been working towards the holy grail of artificial intelligence (AI).
Facial recognition systems, powered by these developments in AI, are spreading into society. This is hailed as progress – we can now use facial recognition to unlock our phones, automatically tag friends in Facebook photos, and use unmanned biometric passport gates at airports.
However, in the rush to adopt these technologies, we must be conscious of their effect on surveillance and their potential to upset the delicate balance between security and privacy.
Facial recognition can have dramatic effects on police efficiency – an investigating officer can simply upload an image and run it through databases in a matter of minutes, significantly reducing the time needed to identify a suspect.
Similarly, security services at airports can automatically flag up those on no-fly lists. To do that, they need to scan the majority of passengers – they may even have scanned you. The US Department of Homeland Security has said that up to 97 per cent of outbound airline passengers will have their faces scanned and scrutinised by 2023.
Even if you don’t fly, your face may have been scanned. Earlier this year, the Metropolitan Police ran a series of trials in public spaces across London. Passers-by had their images streamed directly to the Live Facial Recognition system, which creates a digital version of a detected face and searches it against the list of those wanted by police. Where it makes a match, it sends an alert to an officer in the area, who compares the images and decides whether to stop the person.
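In outline, that matching step is a search over a watchlist of stored face templates. The sketch below illustrates the idea with hypothetical numeric embeddings and an illustrative similarity threshold – the names, the 128-dimensional vectors, and the threshold are all assumptions for the sake of the example, not details of the Met’s actual system, which derives templates from camera frames with a trained face-recognition model.

```python
import numpy as np

# Hypothetical face embeddings standing in for watchlist templates.
rng = np.random.default_rng(0)
watchlist = {name: rng.normal(size=128) for name in ["suspect_a", "suspect_b"]}

def cosine_similarity(a, b):
    """Similarity between two embeddings, from -1 (opposite) to 1 (identical)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(probe, watchlist, threshold=0.6):
    """Return (name, score) for the best match above the threshold, else None."""
    name, template = max(
        watchlist.items(), key=lambda kv: cosine_similarity(probe, kv[1])
    )
    score = cosine_similarity(probe, template)
    return (name, score) if score >= threshold else None

# A probe close to a stored template triggers an alert for a human officer.
probe = watchlist["suspect_a"] + rng.normal(scale=0.1, size=128)
print(match_against_watchlist(probe, watchlist))
```

The final decision – whether to stop the person – rests with the officer who reviews the alert, not the algorithm; the threshold only controls how readily the system raises one.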
Despite high hopes, only eight of the 42 matches made by the system turned out to be correct – a precision of around 19 per cent, which is worrying if it is to be the basis for police strategy.
The technology will doubtless improve, and polling shows that most Londoners already accept that surveillance is a necessary trade-off for assisting police and security services to catch serious criminals. But we are lucky to live in a liberal democracy with high levels of oversight of the state’s actions. Millions are not so fortunate.
Under an authoritarian government like China’s, this kind of digital monitoring serves to increase the reach of a repugnant police state. No one can have missed the coverage of pro-democracy demonstrations in Hong Kong, where fear of the cameras that follow their every move forced protesters to get creative – wearing face masks, dazzling cameras with lasers, and even throwing cardboard boxes over the top of camera poles.
Less well publicised has been the plight of those living in the western region of Xinjiang, where facial recognition technology has helped the police to exert near-totalitarian control over the population and terrorise the Muslim Uighur minority.
Each person identified as Uighur has an ID card with reams of associated data, including family connections, biometric information, and the chillingly named “reliability status”. Every Uighur with a mobile phone must run spyware on their device, provided to them by the government. And to join the data dots with ruthless efficiency, urban areas have CCTV cameras every few metres, recording passing drivers’ faces and number plates.
All of this data is fed directly into an AI-powered system that generates lists of Uighur suspects to be detained in “re-education camps”. Over a million people are estimated to have been held in one of these sites, without access to legal representation, and some escapees have spoken out about torture and starvation.
It would be a mistake to think that this is a problem for China, and China alone. Elements of this approach to surveillance will doubtless inspire other autocratic governments, and China will be happy to sell them the necessary technology.
We in the UK need to have a much more robust conversation about our own privacy. But we also have an important obligation to hold accountable governments whose people do not have that freedom.
That starts with understanding what this technology can do – and, if we don’t work out how to mitigate the risks it poses, the damage it could do to our society.