Trust Me, I’m A Doctor: Dealing With Data In The New Digital Health Age

As recently as 200 years ago, people still thought of bloodletting as a cure-all for disease. Not thousands of years ago; just two centuries. It's astounding to think how far we've come since then. One man who became a hero in the world of medicine was the French physician Pierre Louis, credited as the father of the clinical trial. You could even argue he was the first to use data in medicine. But acceptance of his ideas didn't come immediately. It took a long time to shift paradigms that had been in place for generations.

We're now in an era where, much as in Pierre Louis's time, data could save the day once again. And this time, it's bigger than ever. But are we all ready to adopt new practices?

The premise of the healthcare industry has been the same for most of its existence: you have a problem, and the provider's job is to find a solution – a remedy. That medicine is only as good as the information doctors have available to them, so the goal is to obtain a wider scope of information. Sure, they can take their lead from previous illnesses, lifestyle choices and family health records. But on the whole, the industry is on the hunt for ways to treat us and improve our wellbeing based on proactive intelligence.

Up stepped big data. Everything changed – or at least, is in the process of changing. See, the place we are in now is exhilarating. We're able to use data to our advantage to make healthcare better, treatments more effective and diseases easier to diagnose. Machines can work for us, giving doctors tools that were previously considered impossible. But that doesn't mean all is rosy in the industry. As effective as the technology is, society needs to adjust to such a seismic shift.

New approach

The breakthroughs powering this new wave of Digital Health innovation are promising. By improving our ability to monitor, measure and record symptoms, technology enables doctors to collate data on human health at an unprecedented scale. Advances in artificial intelligence (AI) can make sense of this unstructured data, from test results and scans to consultation notes and appointment details, predicting the risk of disease and recommending treatments.

A recent study by the University of Nottingham compared standard medical guidelines with machine-learning algorithms trained on nearly 400,000 UK patient records to build predictive tools for detecting the risk of heart attacks without any human instruction. The algorithms performed better than the guidelines, correctly predicting 7.6% more potential cases.
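To give a flavour of what that kind of comparison involves, here is a minimal sketch in Python. It uses entirely synthetic data, invented risk factors and a made-up rule-based baseline; it is not the Nottingham team's method or data, merely an illustration of pitting a trained classifier against fixed guideline-style rules.

```python
# Minimal sketch (synthetic data, hypothetical features; NOT the Nottingham study's method):
# compare a rule-based "guideline" against a trained classifier on recall of positive cases.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import recall_score

rng = np.random.default_rng(0)
n = 5000
# Hypothetical patient features: age, systolic blood pressure, cholesterol, smoker flag
X = np.column_stack([
    rng.normal(55, 10, n),    # age (years)
    rng.normal(130, 15, n),   # systolic blood pressure (mmHg)
    rng.normal(5.2, 1.0, n),  # total cholesterol (mmol/L)
    rng.integers(0, 2, n),    # smoker (0/1)
])
# Synthetic outcome loosely driven by the risk factors above, plus noise
risk = (0.03 * (X[:, 0] - 55) + 0.02 * (X[:, 1] - 130)
        + 0.5 * (X[:, 2] - 5.2) + 0.8 * X[:, 3])
y = (risk + rng.normal(0, 1, n) > 1.0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Invented rule-based baseline: flag anyone over 60 who smokes or has high blood pressure
baseline = ((X_test[:, 0] > 60) & ((X_test[:, 3] == 1) | (X_test[:, 1] > 140))).astype(int)

# Machine-learning model trained on the same kind of records
model = GradientBoostingClassifier().fit(X_train, y_train)
predicted = model.predict(X_test)

print("Baseline recall:", recall_score(y_test, baseline))
print("Model recall:   ", recall_score(y_test, predicted))
```

The point of the exercise is the comparison itself: both approaches see the same records, and the question is simply which one misses fewer of the people who go on to have an event.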

IBM Watson's ability to find statistical correlations and trends among cancer patient data sets provides clinicians with evidence-based treatment options. Meanwhile, we're also seeing some clever solutions being developed by the likes of Google DeepMind, Babylon Health and BenevolentAI, which are taking AI concepts and using them to transform the sector. Much of the leading thinking is coming from interesting start-ups, which is great to see.

Bridging the gap

But trust in these innovations is an issue we can't ignore. Among the stories of life-changing successes, there are also data breaches, hoaxes and fake news. The latter overshadow the former and could prove damaging in the long run as the industry looks to continue its growth.

For example, there’s the story of Kenyan smartphone users who were misdiagnosed with HIV by a hoax mobile app. Closer to home, there’s news that NHS hospital trusts in England reported 55 cyber-attacks last year. And few could avoid the news about the WannaCry ransomware attack that affected hospitals across the country and around the world.

All this contributes to a trust shortfall. Like many industries, healthcare is already reassessing its practices. The changes introduced under the General Data Protection Regulation (GDPR) will force businesses to be vigilant with the data they use. From 2018, companies will face heavy fines for data breaches. There will be no margin for error.

Call for change

How is the data being used? Why do tech companies want so much of it? And what does that mean for privacy? These are the questions being asked and the way the industry answers them will be vital for how healthcare evolves. Interestingly, the people asking these questions aren’t necessarily the patients. Anyone who is sick would welcome an outcome where a human doctor is aided by data, as opposed to a human working without it. And besides, treatments driven by data often wouldn’t look much different to patients at the point of delivery.

Be that as it may, there's still a need for health providers to be transparent. As things stand, the Edelman 2017 Trust Barometer finds that trust in the healthcare industry isn't as high as it should be. Trust issues clearly exist, and it is broader societal demand that will provide the loudest voices calling for improvements. To address this challenge, clear, evidence-based communication, combined with real-life examples that reinforce the benefits, will be critical.

Trust must be earned. We can't bulldoze our way into surgeries. It involves a two-way conversation in which companies are cognisant of the public's concerns and address them.

It's also important to recognise that many of the processes we're discussing involve behaviour change for both practitioners and the public. The public in particular can't be expected to become accustomed to it all overnight. The NHS and the wider healthcare and technology sectors need to focus on working together to change behaviour gradually.

Way forward

To continue in the spirit of Pierre Louis, we have to navigate a number of barriers that hinder the adoption of modern medicine. These include economics: who will pay for it, and how? Then there are skills: it's one thing to have the technology and insight, but do we have enough adequately skilled people to roll it out nationwide? Above all, trust will be front and centre of the march forward. Questions will be raised about what happens if data is stolen, and about the integrity of the person, or the machine, that trained the AI.

Gaining trust is a journey, and one we need to embark on with purpose and immediacy. At a time when there are so many ready-made ways for reputations to be tarnished at a moment's notice, the approach we take now will be far-reaching.