Last week, we were greeted by the disturbing news that DeepMind, an artificial intelligence subsidiary owned by Google’s parent company Alphabet, is moving some of its healthcare activities into the newly formed Google Health.
This isn’t the first time that one of the tech giants has been accused of making misrepresentations to consumers, using sleight of hand and semantics to break its promises.
When Facebook acquired WhatsApp in 2014, it asserted that it would not connect the product with its core Facebook experience and that data would not be shared. But it quickly went back on this commitment, breaching the trust of users.
Now, Google’s DeepMind, a company that was given “legally inappropriate” access to 1.6m NHS patient records, according to a senior data protection adviser to the NHS, has breached our trust.
DeepMind has stressed that its current contracts, and the data they process, cannot move over to Google without the prior consent of its partner hospitals, who are the only ones who can decide how data is processed. But critics allege that the company has broken promises it made to patients.
The use of tech and big data in healthcare is a thorny issue. As a society, we want to support positive initiatives, such as the work that DeepMind has completed to improve patient care and help detect acute kidney injuries. The NHS is cash-strapped and looking to improve efficiency wherever it can, and tech plays a crucial role in this.
This has been echoed by health secretary Matt Hancock, who is set to invest more money into healthcare innovation and technology. And with a growing number of innovative “healthtech” startups, there are definitely opportunities for collaboration with the private sector.
However, when it comes to something as sensitive as our health, privacy is paramount, and effective patient safeguards are crucial.
When DeepMind was acquired by Google in 2014, the founders did try to put protections in place, such as appointing an independent reviewer panel in 2016 to scrutinise the company’s work in healthcare. It seems that they truly did intend to be an ethically sound business.
However, the recent absorption of DeepMind into Google Health has revealed that good intentions are not enough. Google has now said that this panel is not fit for purpose if DeepMind is to grow globally, and is disbanding it.
My suspicion is that, amid mounting losses in the DeepMind division, there has been pressure to create economic returns for Alphabet. Alphabet’s profits ultimately win out over the good intentions of the DeepMind team, largely at the expense of the end user, as it is our data that is at risk.
We will most likely never know exactly what happened, but given the revelations about murky business practices and questionable attitudes towards user data at the big tech firms, you’d be justified in suspecting that, perhaps, somewhere deep inside Google, there might have been a team of people who had always planned for this to happen.
Positioning the DeepMind brand as a quasi-research centre gave the subsidiary a chance to develop market opportunities and get access to partnerships and data that Google itself could not access directly, due to public opinion about its ethics.
And even if this wasn’t a deliberate strategy, the result is the same: DeepMind has broken its pledge that “data will never be connected to Google accounts or services”.
I still believe that it is possible for the Silicon Valley giants to do good, even in sensitive areas like healthcare. But somewhere along the road, these businesses have lost their way. We are placing ever more trust in them, and it is disappointing to see our trust broken time and again. It needs to end.
As business leaders in the tech world, we should pledge that 2019 will be the year of making society the number one priority, even at the cost of profit growth. Do this, and these Silicon Valley giants can become an enduring force for good.
But if they keep taking advantage of their power, the future will be very different for these tech powerhouses. Consumers will not tolerate being lied to over and over again.