By Dr. Chase Spurlock


The healthcare industry has firmly embraced big data, but it will be machine learning and deep learning applied to big data that will drive innovation over the next decade.

Skilled medical researchers and physicians will always be at the heart of healthcare. The future, however, will see a larger role for interdisciplinary teams of clinicians, computational biologists, bioinformaticians, software developers, and team members loosely identified as “data scientists.” Data scientists will be charged with integrating input from these key stakeholders to build analytics platforms capable of delivering insights. Today, we’re seeing the early formation of these teams. They are enhancing healthcare decision-making with machine learning algorithms and creating new tools that form the basis of artificial intelligence systems capable of revolutionizing the healthcare profession and delivering on the promise of precision medicine.

Across each point of contact in the healthcare continuum – research data, patient-reported information, provider notes, and payer claims data – there is no shortage of information to comb through as we attempt to identify patterns that define human health and disease. The immediate challenge facing the healthcare industry is this: How do we make sense of the mountains of data to predict outcomes and optimize care for patients over the life of their conditions?


Uses of big data

Companies are using data to predict emergency room wait times down to the minute, facilitate information sharing among hospital departments to improve patient care, and deliver real-time alerts to people wearing health-tracking devices. These achievements were made possible by people who applied cutting-edge techniques to analyze data, see connections, and develop tools that positively impact the delivery and efficacy of patient care.

Big data and machine learning are even influencing drug prescription habits. Locally, we are not immune to the opioid epidemic, and Nashville companies are using predictive analytics to identify patients who are highly susceptible to opioid abuse. This information allows the provider to determine how to best manage these patients and curb addiction.


Detecting and monitoring chronic disease

In the field of applied genomics, where we leverage very large datasets to look at a person’s genetic (molecular) makeup, enhanced computational abilities have already led to significant breakthroughs and accelerated the speed at which discoveries can be made. Today, it costs less than $1,000 and takes a fraction of the time to generate a high-quality “draft” whole human genome sequence. In comparison, it took 13 years and $2.7 billion to generate the first human genome sequence. While still too expensive to perform on every patient, it is conceivable that these technologies will become routine as healthcare integrates information from disparate sources to inform clinical decisions.

There’s no question that we’ve made great strides in our technological capabilities, but the reality is we’re just getting started. We have learned how to generate many unique datasets, but the greater challenge we face is integrating multiple data sources in a usable way to create insights. Combining the power of cutting-edge genomic research with improved access to population-level data sets, like claims data and electronic health record (EHR) data, could produce better predictive or prescriptive tools than either alone. Used independently, these data sources have produced major breakthroughs that aid in the management of disease, but combining them could further enhance our ability to decipher patterns that inform the origins of disease and suggest new ways to manage illness.

For example, our company, IQuity, has deployed machine learning models trained on genomic data (a small sliver of the available healthcare data) to create new diagnostic tests that can detect autoimmune and other chronic conditions, including multiple sclerosis, IBS, Crohn’s disease, ulcerative colitis and fibromyalgia. New machine learning tools leveraging these same datasets will soon be capable of monitoring these conditions throughout a patient’s life.
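To make the idea concrete, here is a minimal, purely illustrative sketch of how a classifier can separate patients from controls using gene-expression profiles. This is not IQuity’s method: the gene signature, the simulated data, and the nearest-centroid rule are all hypothetical assumptions chosen to keep the example self-contained.

```python
import random

random.seed(0)

N_GENES = 20

def simulate_profile(up_genes, noise=0.5):
    """Simulate an RNA expression profile: baseline expression of 1.0,
    with a hypothetical disease signature elevated in `up_genes`."""
    return [
        (3.0 if g in up_genes else 1.0) + random.gauss(0, noise)
        for g in range(N_GENES)
    ]

# Hypothetical signature: genes 0-4 are elevated in "case" samples.
SIGNATURE = set(range(5))

cases = [simulate_profile(SIGNATURE) for _ in range(30)]
controls = [simulate_profile(set()) for _ in range(30)]

def centroid(samples):
    """Average expression of each gene across a set of samples."""
    return [sum(s[g] for s in samples) / len(samples) for g in range(N_GENES)]

case_centroid = centroid(cases)
control_centroid = centroid(controls)

def classify(profile):
    """Nearest-centroid call: label a profile by the closer class centroid."""
    def sq_dist(c):
        return sum((p - v) ** 2 for p, v in zip(profile, c))
    return "case" if sq_dist(case_centroid) < sq_dist(control_centroid) else "control"

# Held-out samples drawn from the same generative model.
assert classify(simulate_profile(SIGNATURE)) == "case"
assert classify(simulate_profile(set())) == "control"
```

Real diagnostic models are trained on thousands of measured transcripts with regularization and cross-validation, but the core pattern is the same: learn a signature that separates disease from non-disease expression profiles.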

An estimated 50 million Americans suffer from an autoimmune disease, with direct healthcare costs totaling at least $100 billion annually. Our new tests and the promise of expanded analytics offerings can reduce misdiagnosis rates and accelerate the diagnostic process by months or even years, helping to identify patients earlier so physicians can prescribe effective treatment faster and improve the long-term outcomes for these patients.

As we move forward, we plan to leverage novel datasets that allow us to predict and monitor chronic diseases, like autoimmune diseases, and develop tools that give employers, payers and providers the ability to more effectively manage the time and cost consumed by treating these chronic patients.


A glimpse into the future

The move toward AI infrastructure in the healthcare industry is still in its infancy. As we look to optimize patient care and reduce ballooning costs, we are obliged to apply data science to these problems, because they will not go away on their own.

If it feels like these breakthroughs are coming at breakneck speed – they are. We’re truly in a renaissance period in medicine and in the life sciences in particular, where the potential of new technologies is vast. We should look to the coming years with excitement as we all work to advance the standard of patient care.



Dr. Chase Spurlock is CEO and co-founder of IQuity, a Nashville-based data science company using genomic and proprietary healthcare data sets to detect and monitor chronic disease. For more information, visit www.iquity.com.