Computing giant IBM has called time on its facial recognition business, saying it opposes the use of the technology for racial profiling and mass surveillance.
IBM chief executive Arvind Krishna said the company will no longer sell facial recognition services, as it seeks to “advance racial equality”.
In a letter to members of the US Congress, Krishna wrote: “IBM firmly opposes and will not condone uses of any technology, including facial recognition technology offered by other vendors, for mass surveillance, racial profiling, violations of basic human rights and freedoms.”
The decision comes in the wake of widespread protests in the US and across the globe calling for an end to systemic racism, and in particular to police brutality against people from ethnic minority backgrounds.
The protests were sparked by the death of George Floyd, an unarmed black man, at the hands of police in Minneapolis last month.
Krishna said that although the artificial intelligence technology underpinning facial recognition tools could “keep citizens safe”, it must be “tested for bias” to ensure it does not discriminate against certain races.
“We believe now is the time to begin a national dialogue on whether and how facial recognition technology should be employed by domestic law enforcement agencies.”
It comes just months after a US government study suggested facial recognition algorithms are far less accurate at identifying African-American and Asian faces than Caucasian ones.
African-American women were even more likely to be misidentified, the research by the National Institute of Standards and Technology (NIST) found.
Meanwhile, last year Joy Buolamwini, founder of the Algorithmic Justice League, testified before Congress about her research at the Massachusetts Institute of Technology into bias in facial recognition software.
IBM boss Krishna also called on the US Congress to establish a “federal registry of police misconduct” to hold police officers to account.
In London, the Met Police is now using its controversial live facial recognition technology following 10 trials across the capital.
This comes despite an independent report into the force’s use of the technology, commissioned by the Met itself, which found it to be inaccurate in 81 per cent of cases.