UK should ban facial recognition technology until clear legal framework is set, new report says
The UK should ban all use of facial recognition technology until a clear legal framework has been developed to regulate it, an independent report has said.
A report from the Ada Lovelace Institute has said there is an “urgent” need for a new framework governing “biometric” technologies – including facial recognition technologies – before they are used on the public by either public or private bodies.
The independent report, written by Matthew Ryder QC, said the government should temporarily ban all use of these technologies until legislation is in place, and called for a legally binding code governing the use of facial recognition technology to be published as soon as possible.
The calls come as critics of facial recognition technology raise concerns about infringements on privacy and the technology’s potential to embed systemic biases.
The report explains that the introduction of laws in 2001 allowing police to collect and keep biometric data, including fingerprints and DNA, saw the UK build up the largest biometric database in the world.
However, the AI think-tank’s paper says the data held in the UK’s biometric database is disproportionately weighted towards those who have previously had contact with the police – whether they were at fault or not – and warns that the use of biometric technologies risks embedding “systemic flaws” in the policing of certain communities.
In calling for new legislation, the paper argues that while “strong law and regulation is sometimes characterised as hindering advancements”, a clear regulatory framework could in fact boost innovation by freeing those who work with biometric data “from the unhelpful burden of self-regulation”.
The report says that, among those interviewed, facial recognition technologies were the primary concern, but it also raises concerns about the use of other forms of biometric data, including health records and behavioural data.
The report also said greater scrutiny should be placed on private use of facial recognition, as in the case of the technology’s deployment in the King’s Cross development.