Visualizing Variable Importance and Variable Interaction Effects in Machine Learning Models

08/09/2021
by Alan Inglis, et al.

Variable importance, interaction measures, and partial dependence plots are important summaries in the interpretation of statistical and machine learning models. In this paper we describe new visualization techniques for exploring these model summaries. We construct heatmap and graph-based displays showing variable importance and interaction jointly, which are carefully designed to highlight important aspects of the fit. We describe a new matrix-type layout showing all single and bivariate partial dependence plots, and an alternative layout based on graph Eulerians focusing on key subsets. Our new visualizations are model-agnostic and are applicable to regression and classification supervised learning settings. They enhance interpretation even in situations where the number of variables is large. Our R package vivid (variable importance and variable interaction displays) provides an implementation.
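As a rough illustration of the workflow described above, the following R sketch shows how the joint importance/interaction displays and the partial dependence matrix might be produced with vivid. The specific function names (vivi, viviHeatmap, viviNetwork, pdpPairs), their arguments, and the use of a randomForest fit are assumptions based on the package's interface and may differ across versions; vivid itself is model-agnostic, so any supervised learner could be substituted.

# Assumed interface; a minimal sketch, not the authors' exact example.
library(vivid)
library(randomForest)

aq <- na.omit(airquality)

# Fit any supervised learning model; vivid is model-agnostic.
fit <- randomForest(Ozone ~ ., data = aq)

# Matrix of variable importance (diagonal) and pairwise
# interaction strength (off-diagonal).
viviMat <- vivi(data = aq, fit = fit, response = "Ozone")

# Heatmap and network (graph-based) displays showing importance
# and interaction jointly.
viviHeatmap(mat = viviMat)
viviNetwork(mat = viviMat)

# Matrix-type layout of all univariate and bivariate partial
# dependence plots.
pdpPairs(data = aq, fit = fit, response = "Ozone")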
