To tree or not to tree? Assessing the impact of smoothing the decision boundaries

October, 2022

Abstract

When analyzing a dataset, it can be useful to assess how smooth the decision boundaries need to be for a model to better fit the data. This paper addresses this question by quantifying how much the 'rigid' decision boundaries, produced by an algorithm that naturally finds such solutions, should be relaxed to obtain a performance improvement. The approach we propose starts with the rigid decision boundaries of a seed Decision Tree (seed DT), which is used to initialize a Neural DT (NDT). The initial boundaries are challenged by progressively relaxing them while training the NDT. During this process, we measure the NDT's performance and its decision agreement with the seed DT. We show how these two measures can help the user determine how expressive the model should be before exploring it further via model selection. The validity of our approach is demonstrated through experiments on simulated and benchmark datasets.
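The sketch below illustrates the evaluation loop described in the abstract: fit a rigid seed DT, train a smoother model, and track the two quantities of interest, test performance and decision agreement with the seed DT. It is a minimal illustration only; the stand-in model, dataset, and hyperparameters are assumptions, and the paper's actual DT-initialized Neural DT is not reproduced here.

```python
# Minimal sketch (assumed setup): a rigid seed DT versus a progressively
# trained smooth model, with accuracy and decision agreement tracked per epoch.
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

X, y = make_moons(n_samples=2000, noise=0.25, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Rigid, axis-aligned decision boundaries from the seed DT.
seed_dt = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)

# Stand-in for the NDT: a differentiable model trained incrementally
# (the paper instead relaxes the seed DT's own splits).
smooth_model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1,
                             warm_start=True, random_state=0)

for epoch in range(50):
    smooth_model.fit(X_tr, y_tr)  # one more epoch per call via warm_start
    perf = accuracy_score(y_te, smooth_model.predict(X_te))
    # Decision agreement: fraction of test points classified identically
    # by the relaxed model and its rigid seed DT.
    agreement = (smooth_model.predict(X_te) == seed_dt.predict(X_te)).mean()
    if epoch % 10 == 0:
        print(f"epoch {epoch:2d}  accuracy={perf:.3f}  agreement={agreement:.3f}")
```

In the paper's setting, watching how performance and agreement evolve as the boundaries relax is what informs the choice of model expressiveness before further model selection.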
