Toward Fairness-Aware Gradient Boosting Decision Trees for Ranking
37 Pages, Posted: 8 Mar 2024
Abstract
Ranking involves training models to prioritize items based on their relevance to a given query, and Gradient Boosting Decision Trees (GBDT)-based ranking methods stand out as a robust choice for addressing the intricacies of ranking tasks. Nonetheless, the inherent fairness risks in ranking tasks require incorporating fairness-aware considerations into GBDT-based models. Because GBDT-based ranking methods are non-differentiable, widely used fairness-aware models cannot be embedded into them effectively. In response to this research gap, we propose a fairness-aware GBDT-based ranking method, the individual fairness gradient boosting decision trees method for ranking (IFGBDT-Rank), to address these issues. To obtain a differentiable GBDT-based model, we introduce the concept of Soft Decision Trees (SDTs) and integrate them through gradient boosting, yielding Soft Gradient Boosting Decision Trees. We then formulate this ensemble for ranking problems; the resulting model, referred to as GBDT-Rank, is differentiable while maintaining strong ranking performance. To address fairness in GBDT-Rank, we adopt fairness-aware ranking based on individual fairness and incorporate an optimal transport-based regularizer into GBDT-Rank to handle individual fairness at the algorithm level, thus introducing IFGBDT-Rank. Comprehensive experiments on synthetic and real-world datasets confirm that IFGBDT-Rank maintains robust ranking performance while simultaneously improving individual fairness. Our code is available at https://github.com/dfhjdskfds/IFGBDT-Rank.
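To illustrate the soft decision tree building block the abstract describes, the sketch below shows one way a differentiable tree can route inputs with sigmoid gates so that gradient boosting (and a fairness regularizer) can be trained end to end. This is a minimal, assumed implementation for illustration only; the class name, depth, and parameterization are not taken from the paper, whose actual code is at the linked repository.

```python
# Minimal sketch of a soft (differentiable) decision tree, assuming a
# sigmoid-gated, breadth-first node layout; hyperparameters are illustrative.
import torch
import torch.nn as nn


class SoftDecisionTree(nn.Module):
    """Depth-d tree whose internal nodes route inputs with sigmoid gates,
    making the whole structure differentiable and usable inside boosting."""

    def __init__(self, in_features: int, depth: int = 3):
        super().__init__()
        self.depth = depth
        n_internal = 2 ** depth - 1        # internal (routing) nodes
        n_leaves = 2 ** depth              # leaf nodes holding scalar scores
        self.gates = nn.Linear(in_features, n_internal)  # one gate per internal node
        self.leaf_values = nn.Parameter(torch.zeros(n_leaves))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # probability of routing "right" at each internal node
        right = torch.sigmoid(self.gates(x))             # (batch, n_internal)
        # path probability to every leaf, built level by level
        path = torch.ones(x.size(0), 1, device=x.device)
        node = 0
        for level in range(self.depth):
            n_nodes = 2 ** level
            gate = right[:, node:node + n_nodes]          # gates at this level
            path = torch.stack([path * (1 - gate), path * gate], dim=-1)
            path = path.flatten(start_dim=1)
            node += n_nodes
        # tree output = expected leaf score under the soft routing
        return path @ self.leaf_values
```

In a boosted ensemble, several such trees would be summed and the whole score function trained with a ranking loss plus a fairness penalty; because every routing decision is a sigmoid rather than a hard split, gradients flow through all of it.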
Keywords: Fairness-aware, Individual fairness, Soft decision trees, GBDT-based ranking