The UK’s data protection watchdog has warned businesses against using “immature” biometric technologies to analyse emotions, saying the risks currently outweigh the benefits.
Emotional analysis technologies process data such as gaze direction, facial movements and expressions, heart rate and skin moisture.
Financial services firms often use this technology in their facial recognition practices, verifying identities by comparing a photo ID with a selfie.
Some airports, too, use the tech to streamline passenger journeys through facial recognition at check-in, self-service bag drops and boarding gates.
“Developments in the biometrics and emotion AI market are immature. They may not work yet, or indeed ever,” deputy commissioner of the Information Commissioner’s Office (ICO), Stephen Bonner, said today.
“While there are opportunities present, the risks are currently greater.
“We are concerned that incorrect analysis of data could result in assumptions and judgements about a person that are inaccurate and lead to discrimination.”
The ICO is due to publish fresh guidance on the use of biometric technology in Spring 2023.
Organisations that fail to act responsibly or pose risks to vulnerable people will be investigated, the watchdog said.
Last week, the government’s Science and Technology Committee launched an inquiry into the regulation of artificial intelligence (AI), as the technology becomes an increasingly attractive avenue for businesses.
“AI is already transforming almost every area of research and business. It has extraordinary potential but there are concerns about how the existing regulatory system is suited to a world of AI,” chair of the Science and Technology Committee, Greg Clark, said at the time.
“With machines making more and more decisions that impact people’s lives, it is crucial we have effective regulation in place. In our inquiry we look forward to examining the government’s proposals in detail.”
The Committee is expected to publish its proposals on how to tackle AI regulation in a White Paper later this year.