Slave to the algo: facial recognition and fraud in a Covid-19 world
“There will come a time when it isn’t, ‘they’re spying on me through my phone’ anymore. Eventually, it will be, ‘my phone is spying on me’.” American author Philip K. Dick warned us about the rapid advancement of technology.
Incredibly, this warning came long before his death in 1982, when mobile phones were the size of bricks and computers were still shrouded in mysticism. Dick’s prophetic novels have since graced the big screen – Do Androids Dream of Electric Sheep? became Blade Runner, and Tom Cruise showed us the value of a black market in ‘working/reusable eyes’ as the lead in the adaptation of Minority Report. Both works depict a world where technology can track our every movement, eliminating privacy as we know it. Sound familiar?
Our heightened reliance on technology during the pandemic has only accelerated that encroachment.
At the beginning of February, Apple announced its new iOS 14.5 update, which will allow the Face ID unlock feature on iPhones to work while the user is wearing a mask. The caveat, however, is that they must also be wearing an Apple Watch at the same time.
An Apple Watch can sense whether it is still being worn, monitoring the wearer’s pulse through its built-in heart rate monitor. Once the watch is unlocked and on the owner’s wrist, it can vouch for the masked Face ID attempt on the paired iPhone. The strength of the Bluetooth signal between phone and watch is also measured to check that both are being held by the same user.
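The decision logic described above can be sketched as a simple combination of checks. This is an illustrative sketch only, not Apple’s actual implementation – the function, its parameters and the signal threshold are all hypothetical stand-ins for the signals the article describes (watch worn and unlocked, phone and watch in proximity, partial face match):

```python
def can_unlock_with_mask(watch_on_wrist: bool,
                         watch_unlocked: bool,
                         bluetooth_rssi_dbm: float,
                         partial_face_match: bool,
                         proximity_threshold_dbm: float = -60.0) -> bool:
    """Hypothetical masked-unlock check: every factor must pass."""
    # Heart-rate sensor confirms the watch is actually being worn,
    # and the watch itself must already be unlocked.
    watch_ok = watch_on_wrist and watch_unlocked
    # A stronger (less negative) Bluetooth signal suggests the phone
    # and watch are held by the same person.
    nearby = bluetooth_rssi_dbm >= proximity_threshold_dbm
    return watch_ok and nearby and partial_face_match

# Worn, unlocked watch right next to the phone, eyes matched -> unlock
print(can_unlock_with_mask(True, True, -45.0, True))  # True
# Watch not on a wrist -> refuse, however close the devices are
print(can_unlock_with_mask(False, True, -45.0, True))  # False
```

The point of the multi-factor design is that no single signal is trusted on its own: remove any one of them and the unlock fails.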
Though arguably helpful, this new feature requires individuals to be further tied to their tech – not only allowing access to facial mapping software, but willingly offering records of their heart rates too.
With each new collection of this most personal form of data, we open ourselves up to exploitation – a dangerous move when we currently have little to no legal framework to regulate these advancements.
This is the Dawning of the Age of the Digital
Facial recognition is an invaluable invention. However, the technology has long been beleaguered by accuracy concerns. The Center for Strategic & International Studies suggests that high levels of accuracy can only be obtained in settings where there is good lighting and faces are unobscured. So, what happens if a subject is wearing glasses, walking down a dimly lit street, in a crowd? These were the issues raised in the UK’s first test case on the matter, Bridges v CCSWP, where the Court of Appeal ruled in August 2020 that the use of Automatic Facial Recognition (AFR) by South Wales Police was unlawful, in part due to a lack of reasonable safeguards. However, what constitutes ‘reasonable’ is yet to be defined.
Moreover, this technology is not immune to fraud. The systems behind face-unlock features are used to open devices containing passwords, bank details and the like. Naturally, fraudsters will be lurking. A 2018 Forbes study demonstrated how the Samsung Galaxy S9 and other products with face-unlock could be fooled using a 3D printout of the user’s head. With the right tools, criminals may now take identity theft to new heights, turning your own face against you.
AI in the Sky
How then can we harness the good, whilst mitigating the bad? Biometric data is a new class of information that requires a methodical approach when it comes to its collection, storage and disposal. At present, we have regulations that deal with the collection and retention of DNA and fingerprints, but nothing to the same standard in relation to facial recognition samples.
While the technology broadly falls under the Data Protection Act 2018, the Protection of Freedoms Act 2012 and Article 8 of the Human Rights Act 1998, there is no single instrument that looks at facial recognition in detail. Rather, we have a patchwork of statutory instruments that cannot keep pace with this technological sophistication. We therefore believe that a bespoke legal instrument, which accounts for the complexity of this technology, must be created.
Without adequate safeguards in place, what is to stop the current free-for-all system of biometric data collection from being misused by fraudsters and for unauthorised mass surveillance, inching our society ever more into the realm of a Philip K. Dick plot?
Perhaps the next Matt Hancock Covid policy will mandate hybrid watches linked to our face masks – compulsory attire for our one hour of permitted outdoor exercise?