Posted May 11, 2018 by admin in articles

Patient Privacy or Better Clinical Outcomes: Will AI in Health Force Us to Decide Between Them?


As our understanding of the human body has evolved, we’ve become increasingly reliant on technology to support us in the diagnosis and treatment of health problems and disease. More recently, as leaps have been made in the application of machine learning and artificial intelligence, novel applications of the technology have led to some initially exciting, even transformative results.

Some of the more talked about innovations rest on the predictive diagnosis of common illnesses to allow for preventative treatment, rather than cure. In these examples, it seems that we have the opportunity to save lives using less invasive and severe methods.

Developed by the University of Nottingham, an AI-based computer program was able to predict 7.6% more heart attacks than doctors using standard methods. At the same time, the software was able to flag patients whose risk was low enough that they could safely forgo medications doctors would commonly prescribe.

The key to the program's success seems to be its ability to process more data, examine more types of patient data, and spot trends over time. This allows it to do things that doctors working in isolation may struggle to do, such as understanding interactions between medications en masse.

Writing to NBC News by email, Dr Stephen Weng, a research fellow at the University of Nottingham, said it would be easy for doctors to use the algorithm in their practices. Once that happens, he said, doctors could quickly generate a list of at-risk patients and arrange appropriate drug and behavioural interventions with enough time to stave off heart trouble.

In a similar application of machine learning, Google's DeepMind began forming partnerships with NHS trusts in a bid to understand how innovative uses of patient data could cut costs and improve public health and patient care. One initiative born of this partnership is 'Streams', a mobile app that allows medical staff to review vital signs for their patients, such as heart rate and blood pressure, while also allowing them to record their own observations.

By providing remote, self-serve access to this data, as well as alerts, the app is believed to address what's known as 'failure to rescue': preventable deaths that occur when warning signs are not picked up quickly enough and medical staff cannot reach the right patient in time.

So far, feedback has been positive, with nurses saying the app was saving them up to two hours each day.

At a time where NHS spending is still growing, but patient care standards are felt to be declining, innovations like this may hold the key to improved health outcomes within the meagre constraints of the budgets that trusts have available. But innovative partnerships with private companies aren’t without their risks.

Indeed, the Information Commissioner ruled that the Google DeepMind and NHS partnership used patient data illegally, after a New Scientist investigation published in 2016 revealed that Google had access to 1.6 million identifiable NHS patient records.

Elizabeth Denham, the information commissioner, said:

“There’s no doubt the huge potential that creative use of data could have on patient care and clinical improvements, but the price of innovation does not need to be the erosion of fundamental privacy rights. Our investigation found a number of shortcomings in the way patient records were shared for this trial. Patients would not have reasonably expected their information to have been used in this way, and the Trust could and should have been far more transparent with patients as to what was happening.”

Back in 2017, you might remember, the ‘WannaCry’ cyber attack hit at least 16 UK hospitals and surgeries, forcing medical staff to revert to pen and paper after they were locked out of their machines. More recently, the Cambridge Analytica scandal affecting Facebook and its users has shone a renewed light on data privacy. It is fair to say that cybersecurity and data privacy are at the forefront of public thinking, and the public and patients alike are rightly concerned about how their data is used and shared.

This raises the question: do we need to sacrifice patient confidentiality and data security to save a crippled NHS?

Some might say it’s a worthy trade-off. Others may prefer to keep their data under lock and key, even if sharing it could save the lives of others. You might also wonder whether marrying data-based innovation with healthcare records needs to come at a confidentiality cost at all; perhaps there’s another way.

Ron Chrisley, director of the Centre for Cognitive Science at the University of Sussex, told Verdict:

“Information is hard to track. Once it gets out of a certain space, say leaves the NHS, it’s hard to tell where it might end up. Our laws need to catch up with these kinds of technologies. We need to rethink what the possible reach of these things might be.”

Meanwhile, in a post apologising for the episode, DeepMind co-founder Mustafa Suleyman and DeepMind clinical lead Dominic King wrote:

“Ultimately, if we want to build technology to support a vital social institution like the NHS, then we have to make sure we serve society’s priorities and not outrun them. There’s a fine line between finding exciting new ways to improve care, and moving ahead of patients’ expectations. We know that we fell short at this when our work in health began, and we’ll keep listening and learning about how to get better at this.”

It’s felt that more can be done. Anonymising data could be a reasonable first step, but without a deeper understanding of how these systems work and arrive at their conclusions, it’s hard to know whether that would detract from the clinical benefits sought.
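To illustrate what such a first step might look like in practice (a hypothetical sketch only, not the actual process used by DeepMind or any NHS trust), one common technique is pseudonymisation: replacing patient identifiers with salted one-way hashes so that records remain linkable for analysis without directly exposing identities. All names and values below are invented for illustration:

```python
import hashlib
import hmac

# Hypothetical secret key, held by the data controller and never shared
# with the data processor. In practice this would be a securely stored
# random value, not a hard-coded string.
SECRET_SALT = b"replace-with-a-securely-stored-random-key"

def pseudonymise(nhs_number: str) -> str:
    """Replace a patient identifier with a keyed one-way hash (HMAC-SHA256).

    The same input always maps to the same token, so records can still be
    linked across datasets, but the token cannot be reversed to recover
    the identifier without the secret key.
    """
    return hmac.new(SECRET_SALT, nhs_number.encode(), hashlib.sha256).hexdigest()

# Hypothetical patient record with a direct identifier.
record = {"nhs_number": "9434765919", "heart_rate": 72, "systolic_bp": 128}

# Share only the pseudonymised version with analysts.
safe_record = {**record, "nhs_number": pseudonymise(record["nhs_number"])}
```

Note that pseudonymisation of this kind is weaker than full anonymisation: the remaining clinical fields can still act as quasi-identifiers, and whoever holds the key can re-link the data, which is partly why regulators treat such datasets as still being personal data.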

But what of those patients who have already been affected by preventable accidents, illness and injury in NHS trusts? While their data may help to prevent tragedy befalling future patients, it does little to help them personally. With NHS trust performance falling sharply despite increased investment, it can be hard to know where to turn. When the wrong diagnosis is given, or an oversight in patient care leads to injury, there is no technological solution; at present, the only ‘remedy’ appears to be seeking legal redress in the form of compensation.

For now, great strides are being made that stand to benefit us all. But if this technology is to advance at pace, data processors and controllers need to restore public faith by improving patient record management and transparency, so that they earn the buy-in they need to continue their work. It would be nice to have the best of both worlds.
