New Financial Conduct Authority (FCA) chair Charles Randell warns of danger of big data ‘algocracy’
The new chair of the Financial Conduct Authority (FCA), Charles Randell, today warned of the danger of a new “algocracy” as emerging technology threatens to harm consumers of financial services.
Randell suggested that the concurrent rise of “big data”, artificial intelligence and machine learning, along with behavioural science insights, could require a new regulatory framework.
New technologies “call into question the adequacy of the traditional liberal approach to the relationship between financial services firms and their customers,” he said in London, in his first public speech since taking up the role. “And regulation is central because it will help define whether AI and big data liberate customers, or disenfranchise them.”
Giant tech firms such as Google and Facebook have built up vast data sets and the ability to mine them in highly profitable ways, but have also faced a string of scandals over how they use that newfound power. The financial firms the FCA regulates, while further behind, have started to rely on algorithms rather than human judgement for many decisions, in ways that may prove harmful, Randell said.
“Society in general and policy makers in particular need to think about how to mitigate the risk that an algocracy exacerbates social exclusion and worsens access to financial services in the way that it identifies the most profitable or the most risky customers,” he said.
Randell pointed to early instances of potential abuse: price comparison websites allegedly quoting higher car insurance premiums to people whose names suggest they may belong to ethnic minorities, and firms hiking prices for customers who are less likely to shop around.
New technologies have the potential to be “used against our interests”, he said.
Meanwhile, he warned that the “liberal approach to markets” rests on the assumption that consumers can make good choices if given fair disclosure, an assumption now being “called into question”, particularly where vulnerable people may not be capable of making informed choices.
Firms must adopt “short and readable” statements on data, rather than “pages and pages of obscure disclosures, disclaimers and consents”, he said.