Explainable Subgradient Tree Boosting for Prescriptive Analytics in Operations Management
https://doi.org/10.1016/j.ejor.2023.08.037
37 Pages. Posted: 28 Apr 2020. Last revised: 25 Oct 2023.
Date Written: June 25, 2022
Abstract
Motivated by the success of gradient boosting approaches in machine learning and driven by the need for explainable prescriptive analytics approaches in operations management (OM), we propose subgradient tree boosting (STB) as an explainable prescriptive analytics approach to solving convex stochastic optimization problems that frequently arise in OM. The STB approach combines the well-known method of subgradient descent in function space with sample average approximation, and prescribes decisions from a problem-specific loss function, historical demand observations, and prescriptive features. The approach provides a decision-maker with detailed explanations for the prescribed decisions, such as a breakdown of individual features' impact. These explanations are particularly valuable in practice when the decision-maker has the discretion to adjust the recommendations made by a decision support system. We show how subgradients can be derived for common single-stage and two-stage stochastic optimization problems; demonstrate the STB approach's applicability to two real-world, complex capacity-planning problems in the service industry; benchmark the STB approach's performance against those of two prescriptive approaches, weighted sample average approximation (wSAA) and kernelized empirical risk minimization (kERM); and show how the STB approach's prescriptions can be explained by estimating the impact of individual features. The results suggest that the STB approach's performance is comparable to those of wSAA and kERM while also providing explainable prescriptions.
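To make the core idea concrete, the sketch below illustrates subgradient boosting in function space for a newsvendor-style single-stage problem, using the sample average approximation of the expected cost as the training objective. This is a minimal illustration under stated assumptions, not the paper's implementation: the pinball (newsvendor) loss, the unit underage and overage costs cu and co, the use of scikit-learn regression trees as base learners, and all hyperparameter values are illustrative choices.

import numpy as np
from sklearn.tree import DecisionTreeRegressor

def newsvendor_subgradient(pred, demand, cu=9.0, co=1.0):
    # Subgradient of the newsvendor loss co*(q-d)^+ + cu*(d-q)^+ with respect to q;
    # at q = d any value in [-cu, co] is a valid subgradient, here we pick co.
    return np.where(pred >= demand, co, -cu)

def fit_stb(X, y, n_trees=200, learning_rate=0.05, max_depth=3, cu=9.0, co=1.0):
    # Start from a constant prescription: the cu/(cu+co) sample quantile,
    # i.e., the featureless SAA newsvendor solution.
    q0 = np.quantile(y, cu / (cu + co))
    pred = np.full_like(y, q0, dtype=float)
    trees = []
    for _ in range(n_trees):
        # Take a step in function space along the negative subgradient of the
        # SAA objective by fitting a tree to the negative subgradients.
        g = newsvendor_subgradient(pred, y, cu, co)
        tree = DecisionTreeRegressor(max_depth=max_depth)
        tree.fit(X, -g)
        pred += learning_rate * tree.predict(X)
        trees.append(tree)
    return q0, trees

def prescribe(X_new, q0, trees, learning_rate=0.05):
    # Prescribed decision: constant start value plus the sum of scaled tree outputs.
    pred = np.full(X_new.shape[0], q0)
    for tree in trees:
        pred += learning_rate * tree.predict(X_new)
    return pred

if __name__ == "__main__":
    # Purely synthetic demonstration data (not from the paper's case studies).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 4))
    y = 100 + 20 * X[:, 0] + rng.normal(scale=5, size=500)
    q0, trees = fit_stb(X, y)
    print(prescribe(X[:5], q0, trees))

Because each tree's contribution enters the prescription additively, feature-attribution techniques for tree ensembles can be applied to break the prescribed decision down into per-feature contributions, which is the kind of explanation the abstract refers to.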
Keywords: Prescriptive Analytics, Machine Learning, Gradient Boosting, Explainable AI