The UK data watchdog says Google DeepMind's NHS trial broke the law on privacy of patient records

Lynsey Barber

DeepMind has become more transparent since alarm over patient data (Source: Getty)

A technology trial between DeepMind, one of the country's top artificial intelligence companies, and a London NHS trust broke the law, the UK's data watchdog has said.

The Royal Free NHS Trust failed to ensure the privacy of patient records when it agreed a deal to share data with the Google-owned tech company in a trial of an app to diagnose kidney disease and alert doctors to risks.

A year-long investigation by the Information Commissioner's Office (ICO) found shortcomings in how the data of 1.6m patients was handled, ruling that the agreement broke data protection rules.

Read more: 5 things we learned about DeepMind's Demis Hassabis on Desert Island Discs

“There’s no doubt of the huge potential that creative use of data could have on patient care and clinical improvements, but the price of innovation does not need to be the erosion of fundamental privacy rights,” said information commissioner Elizabeth Denham, who went to some lengths to explain that she is not anti-innovation.

“Our investigation found a number of shortcomings in the way patient records were shared for this trial. Patients would not have reasonably expected their information to have been used in this way, and the Trust could and should have been far more transparent with patients as to what was happening.

“We’ve asked the Trust to commit to making changes that will address those shortcomings, and their co-operation is welcome. The Data Protection Act is not a barrier to innovation, but it does need to be considered wherever people’s data is being used.”

In a blog post, she acknowledged the positive outcome of the trial and said that work could continue while still complying with the law.

Under the resulting undertakings, the hospital trust will establish a proper legal basis for the trial with DeepMind and any further projects, be transparent about such efforts, and carry out audits of the trials, which will be shared with the ICO.

Read more: How DeepMind's laying foundations for artificial intelligence in the NHS

DeepMind faced criticism over the transparency of the previously secretive project with the Royal Free, and the deal was slammed by the National Data Guardian, Dame Fiona Caldicott. Her assessment was passed on to the ICO investigation.

The tech firm has since inked several new deals to test its technology with NHS trusts but has been more public in its work since the storm around the original agreement, admitting that it "could have done better".

It has unveiled a project using a "blockchain-like" technology to keep data secure and its use transparent. It has also appointed a heavyweight team of tech and health experts to independently review its work.

DeepMind said in a blog post: "We welcome the ICO’s thoughtful resolution of this case, which we hope will guarantee the ongoing safe and legal handling of patient data for Streams.

"Although today’s findings are about the Royal Free, we need to reflect on our own actions too. In our determination to achieve quick impact when this work started in 2015, we underestimated the complexity of the NHS and of the rules around patient data, as well as the potential fears about a well-known tech company working in health. We were almost exclusively focused on building tools that nurses and doctors wanted, and thought of our work as technology for clinicians rather than something that needed to be accountable to and shaped by patients, the public and the NHS as a whole. We got that wrong, and we need to do better. Since then, we’ve worked hard on some major improvements to our transparency, oversight and engagement."
