HiResCAM: Explainable Multi-Organ Multi-Abnormality Prediction in 3D Medical Images

Understanding model predictions is critical in healthcare, to facilitate rapid real-time verification of model correctness and to guard against the use of models that exploit confounding variables. Motivated by the need for explainable models, we address the challenging task of explainable multiple abnormality classification in volumetric medical images. We propose a novel attention mechanism, HiResCAM, that highlights relevant regions within each volume for each abnormality queried. We investigate the relationship between HiResCAM and the popular model explanation method Grad-CAM, and demonstrate that HiResCAM yields better performance on abnormality localization and produces explanations that are more faithful to the underlying model. Finally, we introduce a mask loss that leverages HiResCAM to require the model to predict abnormalities based on only the organs in which those abnormalities appear. Our innovations achieve a 37% improvement, resulting in state-of-the-art weakly supervised organ localization of abnormalities in the RAD-ChestCT data set of 36,316 CT volumes. We also demonstrate on PASCAL VOC 2012 how HiResCAM and Grad-CAM behave differently on natural images. Overall, this work advances convolutional neural network explanation approaches and the clinical applicability of multi-abnormality modeling in volumetric medical images.
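To make the contrast between the two explanation methods and the organ-mask idea concrete, below is a minimal PyTorch-style sketch. The tensor shapes, function names, and the mask-loss formulation are illustrative assumptions for a 3D CNN, not the paper's exact implementation: Grad-CAM spatially averages the gradients into one weight per feature map, whereas HiResCAM keeps the elementwise gradient-activation product before summing over channels.

```python
import torch

def grad_cam_map(activations: torch.Tensor, gradients: torch.Tensor) -> torch.Tensor:
    """Grad-CAM: average the gradients over the spatial dimensions to get one
    weight per feature map, then form a weighted sum of the feature maps.
    activations, gradients: (C, D, H, W) from the last conv layer of a 3D CNN."""
    weights = gradients.mean(dim=(1, 2, 3))                             # (C,)
    return (weights[:, None, None, None] * activations).sum(dim=0)      # (D, H, W)

def hirescam_map(activations: torch.Tensor, gradients: torch.Tensor) -> torch.Tensor:
    """HiResCAM: elementwise product of gradients and activations, summed over
    the channel dimension, so per-location gradient detail is not averaged away."""
    return (gradients * activations).sum(dim=0)                          # (D, H, W)

def mask_loss(attention: torch.Tensor, organ_mask: torch.Tensor) -> torch.Tensor:
    """Illustrative organ-mask penalty (an assumption, not the paper's exact loss):
    discourage positive attention outside the organs in which the queried
    abnormality can appear. organ_mask is 1 inside the allowed region, 0 elsewhere."""
    return torch.relu(attention * (1 - organ_mask)).mean()

# Toy usage with random tensors standing in for a real forward/backward pass.
acts = torch.rand(64, 12, 16, 16)     # feature maps A at the last conv layer
grads = torch.rand(64, 12, 16, 16)    # d(abnormality score)/dA
cam = hirescam_map(acts, grads)       # (12, 16, 16) attention volume
```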
