Monday 13 June 2022
Fairness in Selection Problems

Abstract:

Data-driven decision-making algorithms are increasingly applied in domains with high social impact, such as hiring, lending, and criminal justice. It has recently been shown that such algorithms can discriminate against certain demographic groups (e.g., by race, gender, or age). This observation has sparked an active line of research, called algorithmic fairness, which studies how to develop efficient algorithms with fairness guarantees. Most of the high-impact decision problems mentioned above are essentially selection problems: the decision-maker must select a fixed fraction of the best candidates based on their characteristics. This selection budget distinguishes selection problems from the classification problems typically studied in the algorithmic fairness literature.
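The budget-constrained selection rule described above can be sketched as follows (a minimal illustration; the function name, scores, and budget value are assumptions, not notation from the thesis):

```python
import numpy as np

def select(scores, budget):
    """Select the top `budget` fraction of candidates by observed score.

    `scores` holds each candidate's observed quality; `budget` is the
    fixed fraction of candidates the decision-maker may accept.
    """
    n_selected = int(np.floor(budget * len(scores)))
    # Indices of candidates sorted by score, best first.
    order = np.argsort(scores)[::-1]
    return order[:n_selected]

scores = np.array([0.9, 0.2, 0.7, 0.4, 0.8, 0.1])
print(select(scores, 0.5))  # selects the 3 best candidates: indices 0, 4, 2
```

Unlike binary classification, where any number of candidates may be labeled positive, the budget forces a trade-off: accepting one candidate necessarily displaces another.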

In this thesis, we study the causes of discrimination in selection problems and the impact of fairness mechanisms on the utility of selection. Our first contribution considers a selection problem in which candidate quality is measured with group-dependent noise, a phenomenon called differential variance. We study how differential variance affects group representations and how standard group fairness mechanisms affect the selection utility in its presence. Our second contribution proposes a game-theoretic model of a selection problem with differential variance. We assume strategic candidates who maximize their individual utility by exerting a costly effort; this effort determines their quality, which a (group-fair) decision-maker measures with group-dependent noise. We characterize the equilibrium of such a game. In our third contribution, we consider a multistage selection problem. We extend classical group fairness notions to the multistage setting and propose notions of local (per-stage) and global (final-stage) fairness. We then introduce and study the price of local fairness: the ratio of the utility of the globally fair algorithm to that of the locally fair algorithm.
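The effect of differential variance on group representations can be illustrated with a small simulation (a sketch under assumed parameters, not the model from the thesis): two groups have identical true-quality distributions, but the decision-maker observes one group's quality with more noise. Under a small selection budget, the noisier group ends up overrepresented among those selected, because extreme observed scores are more likely when the noise variance is larger.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000  # candidates per group (illustrative)

# Both groups draw true quality from the same distribution...
true_quality = rng.normal(0.0, 1.0, size=2 * n)
# ...but group B's quality is observed with higher noise variance
# (the sigma values 0.2 and 1.0 are illustrative assumptions).
noise = np.concatenate([rng.normal(0.0, 0.2, n),   # group A: low noise
                        rng.normal(0.0, 1.0, n)])  # group B: high noise
observed = true_quality + noise
group = np.array([0] * n + [1] * n)  # 0 = A, 1 = B

budget = 0.05  # select the top 5% by observed score
k = int(budget * 2 * n)
selected = np.argsort(observed)[::-1][:k]

share_b = group[selected].mean()
print(f"share of group B among selected: {share_b:.2f}")  # well above 0.5
```

Even though neither group is better on average, a score-maximizing selection favors the high-variance group at small budgets, which is one mechanism by which group representations become skewed without any underlying quality gap.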

Updated on 2 June 2022