If the NHS learned anything from the Care.data controversy and, more recently, the ransomware attack that hit trusts up and down the country, it’s that Information Governance matters. People care about protecting their data, and that data has been shown to be extremely valuable to others. So the framework in place to protect it is not just paperwork peppered with complicated legalese; it’s crucial to ensuring that everyone’s confidential health and care data is kept safe and secure.
A recent BBC News article I contributed to, discussing the widespread use of WhatsApp in the NHS as revealed by doctors, sparked a huge conversation on social media about what being compliant with IG standards actually means.
Some were asking: If encryption of consumer messaging systems is as robust as we’re led to believe, why do we need to worry?
The answer is that using such consumer-focused tech for clinical discussions is often illegal, as it could breach the UK Data Protection Act, not to mention the NHS Act 2006, the Health and Social Care Act 2012 and the Human Rights Act.
Although health and care workers in direct contact with patients are allowed to discuss named patients amongst themselves, confidential data must not be shared elsewhere (including, for example, being stored on computers in the US that aren’t compliant with NHS standards) unless the data is anonymised or the patient has given their permission for this to happen.
NHS England’s Information Governance guidance is designed to cover all applicable UK law. That means any technology developed for use by UK health and care workers must comply with this guidance to be legal and, equally importantly, to reach the right standards.
What if I use non-compliant technology?
In simple terms, if the data is not effectively anonymised, then there may already have been a breach.
For example, sharing an image of a medical test result with a colleague using WhatsApp is likely to be in breach of UK law if the patient is identifiable (e.g. their name is visible) and the patient has not given their explicit consent to have their data stored in the US.
And even if clinicians and other health and care workers believe the information they share doesn’t identify patients, they could still be in breach if they use initials or details that could be combined with other sources to identify someone.
The General Medical Council (GMC) Guidance on Anonymised Information, referring to the Information Commissioner’s Office (ICO),
“considers data to be anonymised if it does not itself identify any individual, and if it is unlikely to allow any individual to be identified through its combination with other data. In other words, simply removing the patient’s name, age, address or other personal identifiers is unlikely to be enough to anonymise information to this standard.”
The GMC also clarifies that the level of anonymisation required can vary by the type of data, the size of the dataset, the method of sharing and other factors.
So although we know that most health and care workers try to anonymise their patients’ information before sharing it through WhatsApp, a simple combination of initials, date of birth and GP surgery may mean the data is not effectively anonymised in the eyes of the ICO, putting the sender in breach of UK data protection law.
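To see how little it takes, here is a purely hypothetical sketch in Python. Every record, name and field below is invented for illustration, but it shows how initials, a date of birth and a GP surgery, harmless on their own, can combine with another data source to single out one person.

```python
# Purely hypothetical illustration: every record below is invented.
# It shows how "anonymised" details (initials, date of birth, GP surgery)
# can still single out one person once combined with another data source.

patient_register = [
    {"name": "Jane Smith",  "dob": "1984-03-12", "surgery": "Riverside Practice"},
    {"name": "John Squire", "dob": "1984-03-12", "surgery": "Hilltop Surgery"},
    {"name": "Jai Singh",   "dob": "1991-07-30", "surgery": "Riverside Practice"},
]

def initials(full_name):
    # 'Jane Smith' -> 'JS'
    return "".join(part[0].upper() for part in full_name.split())

def matches(shared_initials, shared_dob, shared_surgery):
    # Return every register entry consistent with the shared details.
    return [
        p for p in patient_register
        if initials(p["name"]) == shared_initials
        and p["dob"] == shared_dob
        and p["surgery"] == shared_surgery
    ]

# Initials alone are ambiguous: three patients share "JS"...
print(len([p for p in patient_register if initials(p["name"]) == "JS"]))  # 3

# ...but initials + date of birth + GP surgery leave exactly one candidate.
print(matches("JS", "1984-03-12", "Riverside Practice"))  # Jane Smith is identified
```

With only a handful of invented patients the effect is stark, but even across a real GP list the combination of these three details can be unique, which is exactly the risk the ICO and GMC guidance describes.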
What’s more, using WhatsApp might also breach Principle 8 of the Data Protection Act, because the USA does not provide adequate protection of personal data in the eyes of UK law unless the recipient of the data has certified compliance with the Privacy Shield framework. WhatsApp is owned by WhatsApp Inc, which does not appear on the Privacy Shield list. WhatsApp’s parent company, Facebook Inc, is certified under Privacy Shield, but our understanding is that US subsidiaries are required to have their own certification or be specifically named on their parent’s registration.
Just a matter of time
I’m not aware of any cases against health and care workers in the UK for using WhatsApp or other non-compliant technologies, but we believe it is unfortunately only a matter of time before someone makes an innocent mistake and the use of non-compliant technology is in the headlines once again.
That’s why we need an urgent shift to technology like medCrowd, which is compliant with NHS and UK Data Protection requirements. Only then will health and care workers feel safe in the knowledge that they can share confidential patient information without accidentally breaking the law.