Optimal Sparse Decision Trees

Advances in Neural Information Processing Systems (NeurIPS), 2019

28 Pages · Posted: 7 Oct 2020


Xiyang Hu

Carnegie Mellon University

Cynthia Rudin

Duke University

Margo Seltzer

University of British Columbia

Date Written: 2019

Abstract

Decision tree algorithms have been among the most popular algorithms for interpretable (transparent) machine learning since the early 1980s. The problem that has plagued decision tree algorithms since their inception is their lack of optimality, or lack of guarantees of closeness to optimality: decision tree algorithms are often greedy or myopic, and sometimes produce unquestionably suboptimal models. Hardness of decision tree optimization is both a theoretical and practical obstacle, and even careful mathematical programming approaches have not been able to solve these problems efficiently. This work introduces the first practical algorithm for optimal decision trees for binary variables. The algorithm is a co-design of analytical bounds that reduce the search space and modern systems techniques, including data structures and a custom bit-vector library. Our experiments highlight advantages in scalability, speed, and proof of optimality.
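The abstract describes pairing a sparsity-regularized objective with fast bit-vector evaluation of candidate trees. As a minimal, hedged illustration of that general idea (not the paper's implementation), the Python sketch below scores a fixed partition of binary-labeled samples into leaves as misclassification error plus a per-leaf penalty. The names bitvec, leaf_error, tree_objective and the value of LAMBDA are hypothetical; the paper's custom bit-vector library and analytical search-space bounds are omitted here.

# Illustrative sketch only: Python ints used as bit-vectors for scoring the
# leaves of a candidate tree on binary data. Names and the penalty value
# below are hypothetical, not taken from the paper.

LAMBDA = 0.005  # per-leaf sparsity penalty (illustrative value)

def bitvec(bits):
    """Pack a list of 0/1 values into a single Python int used as a bit-vector."""
    v = 0
    for i, b in enumerate(bits):
        v |= b << i
    return v

def leaf_error(capture, labels, n):
    """Misclassified fraction of the n samples in one leaf.

    capture marks the samples reaching the leaf, labels marks the positive
    class; the leaf predicts the majority class among its captured samples.
    """
    captured = bin(capture).count("1")
    positives = bin(capture & labels).count("1")
    return min(positives, captured - positives) / n  # minority class is misclassified

def tree_objective(leaf_captures, labels, n, lam=LAMBDA):
    """Regularized objective: total misclassification error plus lam per leaf."""
    return sum(leaf_error(c, labels, n) for c in leaf_captures) + lam * len(leaf_captures)

# Usage: 8 samples split by one binary feature into two leaves.
labels = bitvec([1, 1, 1, 0, 0, 0, 1, 0])
left = bitvec([1, 1, 1, 1, 0, 0, 0, 0])    # samples where the feature is 1
right = bitvec([0, 0, 0, 0, 1, 1, 1, 1])   # samples where the feature is 0
print(tree_objective([left, right], labels, n=8))  # 0.25 error + 2 * 0.005 = 0.26

Replacing per-sample loops with popcounts over whole bit-vectors is the kind of systems-level saving the abstract refers to.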

Keywords: Decision tree, Optimization, Interpretable models, Fairness

JEL Classification: C00, C14, C44, C53, C61, C8

Suggested Citation

Hu, Xiyang and Rudin, Cynthia and Seltzer, Margo, Optimal Sparse Decision Trees (2019). Advances in Neural Information Processing Systems (NeurIPS), 2019, Available at SSRN: https://ssrn.com/abstract=3676619

Xiyang Hu (Contact Author)

Carnegie Mellon University

Pittsburgh, PA
United States

HOME PAGE: http://www.andrew.cmu.edu/user/xiyanghu/



Paper statistics

Downloads: 178
Abstract Views: 604
Rank: 368,584