Using Big Data To Fight Dementia

Technology is revolutionising our understanding and treatment of disease. Medical professionals and patients alike produce huge amounts of data every day from a vast array of sources, including electronic health records, genomic sequencing, high-resolution medical imaging, sensing devices and smartphone applications that monitor patient health. Such data could prove particularly valuable in deepening our understanding of neurodegenerative diseases like dementia.

Nearly 44 million people worldwide now suffer from dementia – a number expected to nearly triple to 115 million by 2050. The crisis is not confined to the rich world: nearly 60% of the burden falls on low- and middle-income nations. The cost of care is immense. According to Alzheimer’s Disease International, it amounts to $604 billion annually – nearly 1% of global GDP. As a result, in April 2012 the World Health Organisation identified dementia as a “public health priority”, and the G7 declared a global “fightback” against the disease.

Nevertheless, whilst billions are spent on research every year, progress towards a cure has been slow. No new drug has come to market in over a decade. More than a hundred years after Dr Alois Alzheimer first described the disease, its molecular basis remains unknown, and none of the drugs on the market address its underlying pathology.

Big data has the potential to change all that. It is well established that dementia develops through the interaction of genetic, non-genetic and environmental factors. Large quantities of the behavioural, genetic, environmental and clinical data relating to those factors are already being generated in laboratories all over the world. Yet until recently it has been impossible to store and process such massive and diverse data sets.

Now an EU-supported research project led by the University of Sheffield’s Centre for Computational Imaging & Simulation Technologies has started to analyse such unstructured data using patented Active Data Replication™ technology. This allows large volumes of continuously changing data to be moved between 8 different cloud providers so that it can be analysed by over 950 applications.

The data then feeds a series of computer platforms available throughout the EU. The first is the clinical research platform Multix, which allows researchers to federate large amounts of information such as MRI scans, physiological data and patient histories. Data from Multix can then be incorporated into a patient care platform (PCP), which lets doctors input patient data and draw on the knowledge already created by Multix. The PCP also helps doctors analyse their patients’ cognitive and motor skills as well as lifestyle and environmental factors, and this analysis feeds back into Multix. Finally, an online citizens’ portal with games and questionnaires collects data from those at risk of developing dementia.

The researchers hope to combine this data with novel biomarkers to provide new and feasible ways of screening for dementia before symptoms appear. That would make it possible to provide the right care at the right time, maximising patients’ quality of life while reducing the burden on health systems.
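To make that data flow concrete, here is a minimal Python sketch of how records from the three sources described above might be normalised into one shared schema and replicated across several cloud stores. It is purely illustrative: the record fields, function names and cloud labels are assumptions made for the example, not the project’s actual code or the Active Data Replication™ API.

```python
# Hypothetical sketch only – all names here are illustrative, not the
# project's real interfaces.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class ResearchRecord:
    """One normalised entry in the shared research store."""
    patient_id: str   # pseudonymised identifier
    source: str       # "multix", "pcp" or "citizens_portal"
    kind: str         # e.g. "mri_scan", "cognitive_test", "questionnaire"
    payload: dict     # source-specific fields
    collected_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )


def normalise_portal_response(raw: dict) -> ResearchRecord:
    """Map a citizens' portal game/questionnaire result to the shared schema."""
    return ResearchRecord(
        patient_id=raw["pseudonym"],
        source="citizens_portal",
        kind="questionnaire",
        payload={"answers": raw["answers"], "game_score": raw.get("score")},
    )


def replicate(record: ResearchRecord, clouds: list[str]) -> None:
    """Stand-in for continuous replication across multiple cloud providers."""
    for cloud in clouds:
        print(f"replicating {record.kind} for {record.patient_id} -> {cloud}")


record = normalise_portal_response(
    {"pseudonym": "p-0042", "answers": {"q1": "yes"}, "score": 87}
)
replicate(record, ["cloud-a", "cloud-b", "cloud-c"])
```

A real federation layer would of course also handle consent, pseudonymisation and consistency guarantees that a toy example like this leaves out.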

Whilst the health sector often declares “war” on diseases and speaks in absolute terms such as “eradication”, the “fightback” against dementia may well lead to success. For the first time in history we have not only large volumes of the right data but also the means to analyse it.
