By Linda Wilson, Managed Healthcare Executive | July 12, 2019

To take advantage of emerging software tools that incorporate artificial intelligence, healthcare organizations first need to overcome a variety of challenges.

Some leading-edge organizations are beginning to do just that, focusing on machine learning, a subset of artificial intelligence (AI) that encompasses statistical methods in which computer systems recognize patterns or correlations by ingesting large sets of training data. These systems improve their performance, or “learn,” over time as they incorporate new data, revising their approach as needed without human programmers updating the rules.
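The principle can be illustrated with a deliberately simple sketch: a one-feature linear model whose parameters are revised from data by gradient descent rather than hand-coded. This is only an illustration of the "learning" idea described above, not any clinical or vendor system.

```python
# Minimal illustration of "learning" from data: fit y ≈ w*x + b by
# repeatedly nudging the parameters to reduce prediction error.

def train(pairs, epochs=2000, lr=0.01):
    """Fit a line to (x, y) training pairs via stochastic gradient descent."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in pairs:
            err = (w * x + b) - y
            w -= lr * err * x   # adjust weight toward lower error
            b -= lr * err       # adjust bias toward lower error
    return w, b

# Training data roughly following y = 2x + 1
data = [(0, 1.0), (1, 3.0), (2, 5.0), (3, 7.0)]
w, b = train(data)
```

Feeding the model additional pairs and rerunning the update loop improves the fit further, which is the sense in which such systems "learn" as new data arrives.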

In the healthcare industry, most machine learning applications are in the research stage. “There is not a ton of clinical use,” according to Brian Edwards, an independent validation consultant for AI vendors.

One area with a lot of research activity is radiology, where the industry is investigating how to use machine learning to detect signs of disease from digital images. “Wherever you have crisp clean data is where you should start. Images are the highest quality data that you have in a health system in terms of reliability,” Edwards says.

Machine learning has also been applied to other areas, such as assessing patients’ risk of hospital readmission, exacerbation of a chronic medical condition, or sepsis during a hospital stay.
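A risk-scoring application of this kind typically combines patient features into a probability. The sketch below is purely illustrative: the feature names and coefficients are invented, where a real model would learn its coefficients from historical patient data.

```python
import math

# Hypothetical readmission-risk score: a logistic model maps patient
# features to a probability between 0 and 1. All names and weights
# here are invented for illustration only.

def readmission_risk(age, prior_admissions, num_chronic_conditions):
    # Invented coefficients; a production model would fit these to data.
    z = (-4.0
         + 0.02 * age
         + 0.6 * prior_admissions
         + 0.4 * num_chronic_conditions)
    return 1 / (1 + math.exp(-z))  # logistic function -> probability

low = readmission_risk(age=40, prior_admissions=0, num_chronic_conditions=0)
high = readmission_risk(age=80, prior_admissions=3, num_chronic_conditions=4)
```

Patients whose score exceeds a chosen threshold could then be flagged for follow-up outreach.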

Preparing for machine learning

“I think planning is where it starts,” says Bob Fuller, managing partner for healthcare at Clarity Insights, an information technology consulting firm focused on data analytics. Fuller says healthcare organizations should assess their overall business strategy and how AI could be deployed to solve specific problems, such as hospital readmissions or claims fraud. The next step is to allocate financial resources to transition the information technology infrastructure, so it becomes AI-ready, Fuller says.

That infrastructure includes a large and diverse set of reliable data to train the machine learning models, whether those models are developed internally or purchased from AI software vendors.


“In every situation, it is necessary to train the AI,” Edwards says. “That knowledge is really institution specific. The heterogeneity of data makes it very difficult, if not impossible, to take something from one organization—or a generic packaged product—and implement it widely. It really is implemented one at a time in an almost à la carte type of way, where you need to customize it.”

Another barrier to implementing machine learning in healthcare organizations is access to high-quality data. Healthcare organizations need to have rigorous processes in place to ensure they have clean and well-defined data, Fuller says. While this has always been true, it becomes even more important as the volume and types of data that healthcare organizations capture continues to grow, he adds.

Real-world data

Geisinger Health—an integrated delivery network with 13 hospital campuses and a nearly 600,000-member health plan—has tapped into its vast store of diverse datasets to develop machine learning applications internally.

It has data on 2 million patients in its electronic health-records system. It also has a stable patient population in Pennsylvania and New Jersey, which allows it to build longitudinal datasets, spanning 20 years.

In its digital imaging system, it has two petabytes of data, which it accumulated over 19 years. Most of this data is from radiology, but some is from other medical disciplines, such as cardiology.

“The ability to have that data and to use it for machine learning is one of our strengths,” says Aalpen Patel, MD, chairman of radiology at Geisinger.

Some applications are in daily clinical use already, including a machine learning model that detects intracranial hemorrhage, or bleeding in the brain, from CT scans. Geisinger uses the model in daily operations to read CT scans of the brain taken in the outpatient setting. If the algorithm detects bleeding, the case is automatically reprioritized as a STAT case in radiologists’ work queue.

While all CT brain scans from hospital units or emergency departments are considered STAT and read within 30 minutes, similar scans taken in the outpatient setting are read within 12 hours. By sending those cases from the outpatient setting through the machine learning model, Geisinger reduces diagnosis times significantly for critical cases coming from the outpatient setting.
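The reprioritization workflow described above can be sketched as a simple priority queue in which model-flagged cases jump ahead of routine reads. The case names and priority values here are illustrative, not Geisinger's actual system.

```python
import heapq

# Sketch of the triage idea: outpatient CT cases enter a radiologist
# work queue, and cases the model flags for possible hemorrhage are
# reprioritized as STAT so they are read first.

STAT, ROUTINE = 0, 1  # lower number = read sooner

def enqueue(queue, case_id, model_flags_bleed):
    priority = STAT if model_flags_bleed else ROUTINE
    heapq.heappush(queue, (priority, case_id))

queue = []
enqueue(queue, "case-101", model_flags_bleed=False)
enqueue(queue, "case-102", model_flags_bleed=True)   # flagged -> STAT
enqueue(queue, "case-103", model_flags_bleed=False)

# Pop cases in reading order: the flagged case comes out first.
order = [heapq.heappop(queue)[1] for _ in range(len(queue))]
```

Because only the queue position changes, a radiologist still reads and confirms every flagged study; the model simply moves suspected bleeds to the front.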

“We don’t take on problems that we don’t think are relevant,” says Brandon Fornwalt, MD, PhD, chairman of the department of imaging science and innovation at Geisinger. The integrated delivery network’s goal is to move from the research phase to implementation in clinical workflows quickly, he says.

In addition to large datasets, ample processing speed is necessary for machine learning. Geisinger solved this problem for its first machine learning application in imaging by purchasing a GPU (graphics processing unit), which accelerates the processing of computational workloads. Since then, it has upgraded its architecture to include multiple GPUs because its need for processing power has grown.

Fuller says accessing cloud options—such as from Amazon Web Services, Microsoft Azure, or Google Cloud—might be more cost effective for many healthcare organizations than building the required infrastructure internally, because the cloud allows organizations to scale computing resources up and down to match current needs.


Having a plan to evaluate machine learning products on the market today also is important. Salt Lake City-based Intermountain Healthcare, which has 23 hospitals and more than 170 clinics, focuses its efforts on purchasing commercially available software. It has investigated many more products than it has purchased, according to Lonny Northrup, senior health informaticist at Intermountain.

Northrup says Intermountain typically won’t move forward with a tool unless the vendor can point to clinical improvements and cost reductions that occurred at another health system.

After analyzing the results the platform achieved elsewhere, Intermountain assesses how well a product works with Intermountain’s patient data. To do this, the health system feeds the product a set of training data—such as on colon surgery—that the health system has already studied, allowing it to verify the insights that the tool derives.
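The verification step described above amounts to checking a tool's output against outcomes the organization has already established. A minimal sketch, with an invented agreement-rate metric standing in for the richer measures a real evaluation would use:

```python
# Sketch of vendor-tool validation: run the tool on a dataset the
# organization has already studied, then compare its flags against
# the known outcomes. Data and metric are illustrative only.

def agreement_rate(tool_flags, known_outcomes):
    """Fraction of cases where the tool agrees with verified outcomes."""
    matches = sum(1 for t, k in zip(tool_flags, known_outcomes) if t == k)
    return matches / len(known_outcomes)

known = [True, False, False, True, False]   # outcomes already verified
flags = [True, False, True, True, False]    # what the tool predicted
rate = agreement_rate(flags, known)         # 4 of 5 agree
```

A tool that cannot reproduce insights the health system has already confirmed on its own data would not advance to a pilot.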

The next step is a pilot test. For example, Intermountain is working with a vendor of patient engagement software on a clinical trial using the product to encourage patients with complex cases of congestive heart failure to follow their medication regimens. Using machine learning methods, the software platform personalizes the recommendations it makes about how to prod patients to behave in ways that improve their health. As part of the project, Intermountain provides 24/7 availability of clinical personnel to respond to these patients’ needs, Northrup says.
