InstanceSHAP: An Instance-Based Estimation Approach for Shapley Values

17 Pages. Posted: 17 Dec 2022

Date Written: December 4, 2022


Explanations are essential components of decision-making problems. However, the complex machine learning (ML) models commonly used to support decisions lack built-in explanations. Model-agnostic explanation methods address this problem by estimating the contribution of each variable to the prediction of any ML model. Among these methods, SHapley Additive exPlanations (SHAP) is the most widely used; it is based on game theory and requires a background dataset when interpreting an ML model. In this study we evaluate the effect of the background dataset on the explanations. In particular, we propose a variant of SHAP, InstanceSHAP, that uses instance-based learning to produce a background dataset for the Shapley value framework. More precisely, we focus on Peer-to-Peer (P2P) lending credit risk assessment and design an instance-based explanation model that uses a background distribution more similar to the instance being explained. Experimental results reveal that the proposed model effectively improves on ordinary Shapley values and provides more robust explanations.
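The abstract describes selecting an instance-specific background dataset for the Shapley value computation. A minimal sketch of the idea follows; the paper's exact procedure is not reproduced here, so the k-nearest-neighbour background selection (k = 20), the synthetic data, and the brute-force exact Shapley computation are illustrative assumptions, not the authors' implementation.

```python
import itertools
import math

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # toy credit-default labels
model = LogisticRegression().fit(X, y)
f = lambda Z: model.predict_proba(Z)[:, 1]  # predicted probability of class 1


def shapley_values(f, x, background):
    """Exact (interventional) Shapley values of f at x.

    The value of a coalition S is the mean prediction when features in S
    are fixed at x and the remaining features are drawn from `background`.
    """
    n = x.shape[0]
    phi = np.zeros(n)

    def v(coalition):
        Xb = background.copy()
        Xb[:, list(coalition)] = x[list(coalition)]
        return f(Xb).mean()

    for i in range(n):
        others = [j for j in range(n) if j != i]
        for r in range(len(others) + 1):
            for S in itertools.combinations(others, r):
                # Standard Shapley weight |S|! (n - |S| - 1)! / n!
                w = math.factorial(len(S)) * math.factorial(n - len(S) - 1) / math.factorial(n)
                phi[i] += w * (v(S + (i,)) - v(S))
    return phi


x = X[0]

# InstanceSHAP-style background: the instance's nearest neighbours
# (assumption: plain Euclidean k-NN with k = 20).
nn = NearestNeighbors(n_neighbors=20).fit(X)
idx = nn.kneighbors(x.reshape(1, -1), return_distance=False)[0]
phi_local = shapley_values(f, x, X[idx])

# Ordinary SHAP baseline: the whole training set as background.
phi_global = shapley_values(f, x, X)
```

Both attributions satisfy the efficiency property (they sum to the prediction minus the mean prediction over their respective backgrounds), but they differ because the local background shifts the reference point toward instances similar to the one being explained, which is the effect the paper studies.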

Keywords: Feature attribution, Shapley values, Machine Learning, Explainability

Suggested Citation

Babaei, Golnoosh and Giudici, Paolo, InstanceSHAP: An Instance-Based Estimation Approach for Shapley Values (December 4, 2022). Available at SSRN:

Golnoosh Babaei (Contact Author)

University of Pavia ( email )

Via San Felice 7
27100 Pavia

Paolo Giudici

University of Pavia ( email )

Via San Felice 7
27100 Pavia

