FCA hands Palantir sensitive data in AI crime push
Britain’s financial watchdog is turning to AI to hunt down fraud, raising security questions over who gets to see and use sensitive data.
The Financial Conduct Authority (FCA) has awarded Palantir a three-month contract to analyse its internal intelligence systems, which contain highly sensitive case files, in a bid to sharpen fraud detection.
The deal, worth over £30,000 a week, will see the US group apply its Foundry platform to the FCA’s vast “data lake”, the Guardian reported, spanning everything from suspicious activity reports and consumer complaints to internal investigations.
Officials have said the trial could pave the way for a wider rollout of AI across the regulator, which oversees around 42,000 firms, from high street banks to crypto exchanges.
The move reflects a broader shift in Whitehall, where departments are increasingly turning to data-driven tools to sift through growing volumes of digital information while under pressure to do more with fewer resources.
But the latest contract also raises familiar questions about how far the UK is willing to rely on overseas tech firms to handle sensitive public data.
Privacy concerns resurface as Palantir footprint grows
Palantir’s reach across the British state has expanded steadily in recent years, with contracts spanning the NHS, policing and defence.
The company now holds more than £500m in UK public sector deals, following a £330m NHS data platform agreement and a £240m Ministry of Defence (MoD) contract.
Its latest foothold in financial regulation takes it into the heart of the City, giving it visibility over one of the UK’s most economically important sectors.
That has prompted unease both inside and outside the regulator due to the breadth of the data involved, which is understood to include emails, call recordings, social media monitoring and reports of suspected criminal activity.
One source familiar with the FCA’s work questioned how much insight the company would gain into enforcement methods, raising concerns about how that knowledge might be used beyond the contract.
Christopher Houssemayne du Boulay, a partner at Hickman & Rose, warned: “We could be talking about hundreds of whole email accounts and full financial records… If you ingest that data and use it to train an AI system, there are very significant privacy concerns.”
The FCA insists safeguards are in place. It said Palantir will act strictly as a “data processor”, meaning it can only operate on the regulator’s instructions, with data stored in the UK and encryption keys retained by the watchdog.
The company will also be required to delete the data at the end of the trial, while any intellectual property generated from the analysis will remain with the FCA.
Even so, the decision to use real data, rather than the synthetic datasets often recommended for such trials, has raised eyebrows, particularly given the sensitivity of the material involved.
Yet the regulator has little choice but to modernise. Financial crime remains one of the UK’s largest categories of offending, and experts have long warned that enforcement agencies are under-using the data already at their disposal.
Professor Michael Levi, a specialist in financial crime at Cardiff University, said there had been “serious under-exploitation” of regulatory data, and that AI could offer a meaningful step change in detection capabilities.