Medical Imaging with AI and Deep Learning

Despite the slow adoption of AI in healthcare, the industry is now profoundly influenced by the intersection of digital transformation (DX) and Artificial Intelligence (AI). Having undertaken the massive task of implementing electronic medical records, many provider organizations are now looking to apply technology to improve clinical outcomes and operational efficiency. Healthcare CIOs have just started to absorb and process DX, and now they also need to consider how analytics and AI fit into the picture. The new digital enterprise, by definition, will be AI-enabled. As AI and machine learning become more widely implemented in clinical practice, I expect evidence-based research in the field will make their use commonplace in settings like the ER, pathology labs, and imaging.

Traditionally, AI is the overarching category and includes rules that are explicitly programmed by humans: if-then statements, knowledge graphs, expert systems, or anything symbolic. An early AI program learned to play checkers because a programmer supplied it with the rules of the game. It could not, however, use those same rules to play chess. Machine Learning (ML) is a subset of AI, and Deep Learning (DL) is a subset of machine learning. Both methods use training and test data to create a model, then go through an optimization process to find the weights that make the model best fit the data. Either method can handle numeric (regression) and non-numeric (classification) problems. But that’s where the similarity ends.
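To make the shared idea concrete, here is a minimal sketch of what “an optimization process to find weights that fit the data” looks like: gradient descent on a least-squares loss. The data is synthetic and every value is illustrative.

```python
# Minimal sketch: "training" means adjusting weights to reduce a loss.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=100)                       # training inputs
y = 3.0 * X + 2.0 + rng.normal(scale=0.1, size=100)  # noisy targets

w, b = 0.0, 0.0                                # weights, initialized arbitrarily
lr = 0.1                                       # learning rate

for _ in range(200):                           # optimization loop
    err = (w * X + b) - y                      # residuals of current model
    w -= lr * 2 * np.mean(err * X)             # gradient of mean squared error w.r.t. w
    b -= lr * 2 * np.mean(err)                 # gradient w.r.t. b

print(f"learned w={w:.2f}, b={b:.2f}")         # should approach w≈3, b≈2
```

The same fit-the-weights-to-the-data loop underlies both classic ML and deep learning; what differs is the size and structure of the model being optimized.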

Machine learning algorithms parse data, learn from it, and make informed classifications or predictions based on what they have learned. There are different paradigms of learning: supervised, unsupervised, and reinforcement, to name a few. There are many types of regression algorithms, such as linear, non-linear, decision-tree, and random forest, among others. To evaluate a model, the user feeds it new data it has not seen and examines the accuracy of the output. Human intervention is required to retrain the model if that output is not accurate enough. A human also has to do feature extraction: someone with domain knowledge must reduce the initial set of raw data into manageable groups for processing while still retaining the descriptors of the original data. Clean datasets and well-chosen features speed up training and keep processing requirements reasonable.
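As a hedged illustration of that workflow, the sketch below hand-picks a few features, trains a random-forest classifier, and checks accuracy on held-out data. The dataset and the feature subset are arbitrary stand-ins, not a clinical pipeline.

```python
# Classic ML workflow: manual feature selection, training, evaluation on unseen data.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

data = load_breast_cancer()
# "Feature extraction": a domain expert narrows the raw measurements to a
# manageable subset (here, the first five columns, chosen arbitrarily).
X = data.data[:, :5]
y = data.target

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)            # training phase

preds = model.predict(X_test)          # new data the model has not seen
print("accuracy:", accuracy_score(y_test, preds))
# If accuracy is too low, a human revisits the features and retrains.
```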

The term “Deep Learning” is essentially a rebranding of the decades-old term “neural networks.” Deep learning algorithms use layers to create an artificial neural network that learns and makes decisions on its own, guided by its error rate or loss function. Deep learning requires more compute power, typically on specialized processors such as a GPU or a Tensor Processing Unit (TPU). When you have massive amounts of data to train with and many features to consider, deep learning is the way to go. The descriptor “deep” means that the output of one transformation stage feeds the next, through one or many more stages, to obtain the final output. The intermediate stages between input and output are referred to as the “hidden” layers of a deep learning model. Another benefit is that DL does feature extraction automatically, sparing the data scientist that difficult task.
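Here is a minimal sketch of such a network, assuming TensorFlow/Keras is installed. The layer sizes are arbitrary; the point is only that each layer’s output feeds the next, and the layers between input and output are the “hidden” ones.

```python
# A small fully connected ("deep") network for a binary classification task.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(20,)),             # 20 input features
    tf.keras.layers.Dense(64, activation="relu"),   # hidden layer 1
    tf.keras.layers.Dense(32, activation="relu"),   # hidden layer 2
    tf.keras.layers.Dense(1, activation="sigmoid"), # output layer
])

# The loss function drives the optimization: weights in every layer are
# adjusted to reduce it during training.
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```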

The most prevalent use cases of DL include image recognition, facial recognition, natural language processing, handwriting transcription, recommendation engines, autonomous vehicles, game playing, contextual search, and medical diagnosis. In healthcare, one of the most promising applications of deep learning to medical imaging is radiology. Medical imaging use cases rely on a specialized mathematical operation called convolution instead of general matrix multiplication in the network layers. Convolutional Neural Networks (CNNs) are used to classify images, cluster them by similarity, and recognize what they see.
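For illustration, here is a hedged sketch of a small CNN classifier of the kind this paragraph describes, again assuming TensorFlow/Keras. The input size and the two-class output are placeholders; production radiology models are far larger and trained on curated clinical datasets.

```python
# A small convolutional network for image classification.
import tensorflow as tf

cnn = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(128, 128, 1)),        # grayscale image, e.g. one CT slice
    tf.keras.layers.Conv2D(16, 3, activation="relu"),  # convolution, not general matrix multiply
    tf.keras.layers.MaxPooling2D(),                    # downsample the feature maps
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(2, activation="softmax"),    # placeholder: two diagnostic classes
])

cnn.compile(optimizer="adam",
            loss="sparse_categorical_crossentropy",
            metrics=["accuracy"])
cnn.summary()
```

Stacking convolution and pooling layers lets the network learn local image features such as edges and textures before the dense layers combine them into a class decision.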

Here are some examples of real progress in using deep learning in healthcare:

  • Heart failure precursors using electrocardiogram data to help identify asymptomatic left ventricular dysfunction. Nature Medicine, Mayo Clinic.
  • Minimize ineffective therapies with predictions of how breast tumors respond to neoadjuvant chemotherapy (NAC). Journal of Digital Imaging, Columbia University Irving Medical Center (CUIMC), New York.
  • Detection of intracranial hemorrhage (ICH) with a classification of subtypes on unenhanced head CT scans. Nature Biomedical Engineering, Massachusetts General Hospital.
  • MRI Image reconstruction from accelerated data acquisitions. Daniel Sodickson, MD, Ph.D., Vice-Chair Radiology, NYU.
  • Convolutional neural network for image analysis of skin lesions to aid physicians in melanoma detection. Annals of Oncology, Holger A. Haenssle, MD, Dermatology, Heidelberg University, Germany.
  • Level of lymphocyte infiltration of a tumor with a predictive score for the efficacy of immunotherapy. Lancet Oncology Journal, Cleveland Clinic, OH.

Digital Transformation (DX) means applying new technologies to change processes, customer experience, and value creation in radically new ways; it disrupts existing technologies rather than merely enhancing them. When you combine DX with AI, you can transform many areas of medical imaging, most notably radiology. When a radiologist can use AI to filter out the noise and aggregate imaging findings with priors from other clinical departments, lab results, health history, exams, patient genomics, and social determinants of health, you create a profoundly efficient diagnostic hub.