
Audit of DeepMind deal with NHS trust: It checks out, nothing to see here

Law firm's conclusions at odds with UK data watchdog, and critics don't feel any better

An audit of the Royal Free NHS Trust and Google DeepMind's controversial app to detect kidney disease has deemed its current use of confidential data from real patients lawful – going so far as to suggest findings from other watchdogs were misplaced.

The audit of the Streams app – which uses a fixed algorithm to help detect patients with acute kidney injury (AKI) – was ordered by the Information Commissioner's Office after widespread concerns about the scale and scope of the project.

The app, developed by the health arm of Google's AI biz and put into use at the London trust in 2015, sucked up the confidential medical records of the 1.6 million patients who had attended the Royal Free in the previous five years. The data stored by the app now reaches back eight years for some people.

The Trust's decision to share the records with the firm came under fire from patient groups, who were concerned it breached confidentiality and data protection laws.

Investigations by the National Data Guardian, Fiona Caldicott, and the ICO found that the legal basis the Royal Free and DeepMind had used to justify the sharing of confidential data – implied consent – was not appropriate.

The ICO said that patients would not have "reasonably expected" their entire patient record to be shared with DeepMind for the testing of a new mobile app for AKI, as this was not for their direct care, and that the Trust had failed to adequately inform people how their data would be used.

As a result, the Trust commissioned Linklaters LLP to conduct an audit of its use of Streams, the findings of which were published late yesterday (PDF).

The scope, though, is limited just to the current use of Streams and, as emphasised repeatedly in the report, does not include a historical assessment of the app. That will frustrate critics who want to see the initial data gathering put under the microscope.

Disagreement on duty of confidence

In a ruling that will be seen by the Royal Free and DeepMind as vindication of their work, the law firm found that the use of confidential records for the operation of Streams is lawful under both data protection and confidentiality rules.

The latter decision – saying the hospital had not breached its duty of confidence – is in direct conflict with those of the ICO and NDG. Linklaters noted this in the report, but argued that – unlike data protection law – confidence laws are mostly based on case law.

The auditor said that the duty of confidence for health professionals arises in equity – meaning that the question is whether use of the data for the app would trouble a health professional's conscience.

It added that it was not clear if implied consent should be restricted to direct care, and that it would be hard to apply a test based on a patient's reasonable expectations to such a large group.

However, the ICO has indicated this debate is not over, saying in a statement that it has reserved its position in relation to medical confidentiality. "We are seeking legal advice on this issue and may require further action," said deputy commissioner Steve Wood.

Docs can get this info anyway, so...

The auditor also found that the use of real patient records was lawful for testing the app – as long as using that information is "genuinely necessary" and steps are taken to minimise the data involved and apply the right controls.

Underlying this is the question of the scale of the data-sharing between the three hospital sites in the trust and DeepMind and – more fundamentally – whether any information needs to be stored on Streams at all; but the auditor tended to accept the current state of affairs.

For instance, the report said that Streams has to use real patient data in order to ensure its safe operation, and that side-by-side testing of the app may require comparison with real, full-scale data sets.

The auditor did note, though, that any future privacy impact assessments need to set out clear justification for, among other things, the use of non-synthetic information as well as the volume of data being used.

Another concern is the lack of a formal retention period for the data stored by the app: although only data from the previous 12 months is needed to generate an AKI alert on the app, Streams now contains information up to eight years old.
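To see why 12 months is the floor, here is a minimal Python sketch of the detection rules – a loose reading of the published NHS England AKI algorithm, which Streams is reported to implement, and emphatically not DeepMind's actual code. The current creatinine result is compared against baselines drawn from the previous week and the previous year, so the alerting logic alone needs up to a year of history per patient.

```python
# Minimal sketch of the published NHS England AKI detection rules --
# an illustration only, not DeepMind's implementation. The current
# serum creatinine (C1) is compared against the lowest result from the
# previous 0-7 days (RV1) and the median of results from 8-365 days
# ago (RV2): hence the 12-month window mentioned above.
from datetime import datetime, timedelta
from statistics import median

def aki_stage(current: float,
              history: list[tuple[datetime, float]],
              now: datetime) -> int:
    """Return AKI stage 0-3 for a creatinine result in umol/L.

    `history` holds (timestamp, creatinine) pairs for one patient.
    """
    week_ago = now - timedelta(days=7)
    year_ago = now - timedelta(days=365)

    recent = [v for t, v in history if week_ago <= t < now]
    older = [v for t, v in history if year_ago <= t < week_ago]

    ratios = []
    if recent:
        ratios.append(current / min(recent))    # C1 / RV1
    if older:
        ratios.append(current / median(older))  # C1 / RV2
    if not ratios:
        return 0  # no baseline, so the algorithm cannot fire

    ratio = max(ratios)
    if ratio >= 3.0:
        return 3
    if ratio >= 2.0:
        return 2
    if ratio >= 1.5:
        return 1
    # An absolute rise of more than 26 umol/L within 48 hours also
    # triggers a stage 1 alert under the national algorithm.
    two_days_ago = now - timedelta(hours=48)
    last_48h = [v for t, v in history if two_days_ago <= t < now]
    if last_48h and current - min(last_48h) > 26:
        return 1
    return 0
```

Nothing in that logic requires results older than 365 days, which is what makes the eight-year store the point of contention.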

The auditors said clinicians told them the older data was useful for context, and pointed out that doctors could access "much older information directly from those systems regardless of any retention period applicable to Streams".

A similar argument – that the hospital's systems hold all this information already, so this is just duplication – is made throughout the report.

But this position is given short shrift by opponents.

"It's still clinical care through a mass surveillance lens," said Eerke Boiten, professor of cybersecurity at De Montfort University. "They need data (now grown to eight years' worth) on all potential patients 'just in case' – even though they admit 'the AKI event might only occur in the future or not occur at all'.

"This is justified by drawing an analogy with the hospital's regular data systems: they hold all the data on all past patients, so why shouldn't Streams too?"

'Insurmountable technical barriers'

A more minimal approach would be for Streams to query the Royal Free's systems as and when required, and then purge that information after a set time – but the audit dismisses this as a technical impossibility.

Despite acknowledging that it isn't in a "position to determine if the technical barriers to move to a query-based model are insurmountable", the law firm appears to take the Trust – which said its systems aren't up to the task – at its word.

For instance, the report said, any such query-based system would need to use the Royal Free's Open Database Connectivity connection to the central database, which can't handle the volume of queries Streams would generate; and the app wouldn't work if any one of the 120 systems across which patient info is spread were down.
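For a flavour of what that alternative could look like, here is a hypothetical Python sketch of a query-on-demand design with a hard retention window. The DSN, table and column names are all invented for illustration, and the sketch glosses over exactly the availability problem the Trust cites: if the upstream system is unreachable, there is nothing to serve.

```python
# Hypothetical query-on-demand design: fetch a patient's results over
# ODBC only when an alert is being evaluated, cache them briefly, then
# purge. The DSN, table and column names are invented for illustration.
import time
import pyodbc

CACHE_TTL_SECONDS = 3600  # illustrative retention window: one hour
_cache: dict[str, tuple[float, list]] = {}

def creatinine_history(patient_id: str) -> list:
    """Fetch up to 12 months of creatinine results on demand."""
    hit = _cache.get(patient_id)
    if hit and time.time() - hit[0] < CACHE_TTL_SECONDS:
        return hit[1]
    # If this connection fails, so does the alert -- the resilience
    # objection raised by the Trust.
    conn = pyodbc.connect("DSN=pathology")  # invented DSN
    try:
        rows = conn.execute(
            "SELECT taken_at, value FROM creatinine "
            "WHERE patient_id = ? "
            "AND taken_at > DATEADD(month, -12, GETDATE())",
            patient_id,
        ).fetchall()
    finally:
        conn.close()
    _cache[patient_id] = (time.time(), rows)
    return rows

def purge_expired() -> None:
    """Drop any cached records older than the retention window."""
    cutoff = time.time() - CACHE_TTL_SECONDS
    for pid in [p for p, (ts, _) in _cache.items() if ts < cutoff]:
        del _cache[pid]
```

Whether the Royal Free's ODBC link could sustain that query load is, of course, the Trust's whole objection.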

But, as Boiten noted, integrating data across the Trust would have applications ranging far beyond the detection of acute kidney injury. "It sounds like the Royal Free could do with an integrated electronic patient record system," he told The Reg.

Elsewhere, the auditor also recommended that a memorandum of understanding drawn up between the Royal Free and DeepMind in January 2016 relating to a proposed AI project using depersonalised data be scrapped. This is because the pair have given up on the research, and the MOU has "very limited relevance to Streams".

The Register asked the hospital if it was planning to review or scrap this MOU – but it has yet to respond.

What's Google got to do with it?

However, beyond the ins and outs of the deal, there is a more fundamental issue for many critics: Google's involvement in health data. This, too, is shrugged off by the auditors, who make much of DeepMind's data protection training and practices, and say the firm shouldn't be treated any differently from other companies.

"In conducting our review, we considered if we ought to treat DeepMind differently from the Royal Free's other information technology partners, such as Cerner," the report said.

"We decided that this is would not be appropriate. DeepMind acts only as the Royal Free's data processor... Given this limited mandate, we do not see why the Royal Free's engagement with DeepMind should be any different from its use of other technology partners."

But Boiten disagreed. "The motivation for the world's top AI company to act as a software house is still essentially missing – unless, of course, its parent company wanted to establish its usual monopoly in the NHS data market," he said.

Sam Smith, coordinator of MedConfidential, echoed this argument. The hospital might say the vast amounts of data collected are necessary for "vital interests" of patients, he said, but: "The only 'vital interest' protected here is Google's, and its desire to hoard medical records it was told were unlawfully collected.

"The vital interests of a hypothetical patient are not vital interests of an actual data subject." ®
