NHS gave DeepMind patient records on an ‘inappropriate legal basis’

A top UK government advisor is questioning a deal that granted DeepMind access to 1.6 million patient records.

A data-sharing deal between DeepMind and London's Royal Free Hospital Trust was struck on an "inappropriate legal basis," a top UK government advisor has said. In April 2016, New Scientist revealed that the company had received 1.6 million patient records to develop an app called "Streams." While there are strict rules regarding patient data and confidentiality, common law states that consent is "implied" if the information is being used for "direct care." Google's AI division used this line of thinking to justify the deal; however, Dame Fiona Caldicott, the UK's National Data Guardian, disagrees, because the app was still in testing at the time.

Because the app was still in development, it wasn't yet ready for medical use. Testing, Caldicott argues, is different from "direct care" and nullifies the argument that the company had "implied consent" from patients. In a letter to Professor Stephen Powis, medical director at the Royal Free Hospital, Caldicott said: "Given that Streams was going through testing and therefore could not be relied upon for patient care, any role the application might have played in supporting the provision of direct care would have been limited and secondary to the purpose of the data transfer."

The letter, obtained by Sky News, was sent by Caldicott in February. She passed the same conclusions along to the Information Commissioner's Office (ICO), which is currently investigating the deal. In March, the data protection watchdog said it was "close" to publishing its findings, which could dramatically affect how the UK's National Health Service (NHS) works with technology companies like DeepMind in the future. As Bloomberg reports, the ICO has the power to fine the hospital and impose other sanctions if it finds that the original data transfer was illegal.

Streams is an "instant alert app" that quickly reviews test results and looks for serious problems, such as acute kidney injury. If anything is found, doctors are alerted immediately so that further diagnosis and, potentially, life-saving treatment can be carried out efficiently. Caldicott does not dispute the app's effectiveness or the use of technology to improve healthcare. The mining of patient data, however, needs to rest on a sound legal basis and be done in a "transparent and secure manner," she says; otherwise public trust, and the opportunities for further discussion and development, will vanish.

New Scientist's original article, sourced from a Freedom of Information (FOI) request, triggered a public outcry from the academic community. One study concluded that DeepMind and the NHS had made "inexcusable" mistakes, and that the ICO's case should be treated as a "cautionary tale" for future technology and healthcare partnerships. The backlash caused the AI company to temporarily suspend Streams' use in hospitals. A new data-sharing agreement was drawn up last November, alongside confirmation that the app had been registered as a medical device with the Medicines and Healthcare products Regulatory Agency (MHRA). It's now been rolled out widely across the Royal Free's hospitals in London.

DeepMind has so far defended its actions. A spokesperson told The Register: "Nurses and doctors have told us that Streams is already speeding up urgent care at the Royal Free and saving hours every day. The data used to provide the app has always been strictly controlled by the Royal Free and has never been used for commercial purposes or combined with Google products, services or ads – and never will be. Clinicians at the Royal Free put patient safety first by testing Streams with the full set of data before using it to treat patients. Safety testing is essential across the NHS, and no hospital would turn a new service live without testing it first."

Health data privacy group medConfidential takes a different view, however: "This response by Google shows that DeepMind has learnt nothing. There may well be lawful reasons for third party IT providers to process data for direct care for 1.6 million patients – unfortunately for Google's AI division, developing an app is not one of them."