Best Split Nodes for Regression Trees

06/24/2019
by Jason M. Klusowski, et al.

Decision trees with binary splits are commonly constructed using the Classification and Regression Trees (CART) methodology. For regression models, at each node of the tree the data are divided into two daughter nodes according to a split point that maximizes the reduction in variance (impurity) along a particular variable. This paper develops bounds on the size of a terminal node formed from a sequence of optimal splits via the infinite-sample CART sum-of-squares criterion. We use these bounds to derive a connection between the bias of a regression tree and the mean decrease in impurity (MDI) measure of variable importance, a tool widely used for model interpretability, defined as the weighted sum of impurity reductions over all nonterminal nodes in the tree. In particular, we show that the size of a terminal subnode for a variable is small when the MDI for that variable is large. Finally, we apply these bounds to show consistency of Breiman's random forests over a class of regression functions. The context is surprisingly general and applies to a wide variety of multivariable data-generating distributions and regression functions. The main technical tool is an exact characterization of the conditional probabilities of the daughter nodes arising from an optimal split, in terms of the partial dependence function and the reduction in impurity.
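To make the split criterion concrete, the following is a minimal Python sketch of the empirical (finite-sample) version of the CART sum-of-squares search along a single variable; the function name and implementation details are illustrative and not taken from the paper, which analyzes the infinite-sample analogue of this criterion.

    import numpy as np

    def best_split(x, y):
        """Exhaustive search for the split point on one variable that
        maximizes the reduction in variance (CART sum-of-squares criterion).
        Illustrative sketch; not the paper's code."""
        order = np.argsort(x)
        x_sorted, y_sorted = x[order], y[order]
        n = len(y)
        parent_impurity = np.var(y)  # impurity of the parent node

        best_gain, best_threshold = -np.inf, None
        for i in range(1, n):
            if x_sorted[i] == x_sorted[i - 1]:
                continue  # cannot split between identical values
            left, right = y_sorted[:i], y_sorted[i:]
            # weighted impurity of the two daughter nodes
            child_impurity = (len(left) * np.var(left)
                              + len(right) * np.var(right)) / n
            gain = parent_impurity - child_impurity  # reduction in impurity
            if gain > best_gain:
                best_gain = gain
                best_threshold = 0.5 * (x_sorted[i - 1] + x_sorted[i])
        return best_threshold, best_gain

    if __name__ == "__main__":
        # Toy usage: a step function in x, so the best split should land near 0.5.
        rng = np.random.default_rng(0)
        x = rng.uniform(size=200)
        y = (x > 0.5).astype(float) + rng.normal(scale=0.1, size=200)
        print(best_split(x, y))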
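The MDI quantity described above, the weighted sum of impurity reductions over all nonterminal nodes, can be written as follows; the notation (the node set T, the split variable v(t), the weight w(t), and the reduction Delta(t)) is chosen here for illustration rather than taken from the paper:

    \[
      \mathrm{MDI}(X_j) \;=\; \sum_{\substack{t \in \mathcal{T} \\ v(t) = j}} w(t)\,\Delta(t),
    \]

where the sum runs over the nonterminal nodes t that split on variable X_j, w(t) is the fraction of samples reaching node t, and Delta(t) is the impurity reduction achieved by the optimal split at t.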
