Joint Multisided Exposure Fairness for Recommendation

Bhaskar Mitra
Chen Ma
Haolun Wu
Xue Liu
Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval (2022)

Abstract

Prior research on exposure fairness in the context of recommender systems has focused mostly on disparities in the exposure of individual or groups of items to individual users of the system. The problem of how individual or groups of items may be systemically under- or over-exposed to groups of users, or even all users, has received relatively less attention. However, such systemic disparities in information exposure can result in observable social harms, such as withholding economic opportunities from historically marginalized groups (allocative harm) or amplifying gendered and racialized stereotypes (representational harm). Previously, Diaz et al. (2020) developed the expected exposure metric, which incorporates user browsing models developed for information retrieval, to study fairness of content exposure to individual users. We extend their framework to formalize a family of exposure fairness metrics that model the problem jointly from the perspective of both consumers and producers. Specifically, we consider group attributes for both types of stakeholders to identify and mitigate fairness concerns that go beyond individual users and items towards more systemic biases in recommendation. Furthermore, we study and discuss the relationships between the different exposure fairness dimensions proposed in this paper, and demonstrate how stochastic ranking policies can be optimized towards these fairness goals.
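
To make the expected-exposure idea concrete, the following is a minimal sketch (not the authors' code) of estimating per-item expected exposure for a stochastic ranking policy under an RBP-style user browsing model, and then aggregating exposure over consumer (user) and producer (item) groups, which is the kind of group-level view the joint multisided metrics build on. All function names, parameters, and the toy data below are illustrative assumptions.

```python
import numpy as np


def rank_exposure(num_items: int, gamma: float = 0.8) -> np.ndarray:
    """Exposure assigned to each rank by an RBP-style browsing model:
    the item at rank k (0-indexed) receives gamma**k attention."""
    return gamma ** np.arange(num_items)


def expected_exposure(sampled_rankings: np.ndarray, num_items: int,
                      gamma: float = 0.8) -> np.ndarray:
    """Monte Carlo estimate of per-item expected exposure for one user.

    sampled_rankings: (num_samples, num_items) array; each row is a
    permutation of item indices drawn from the stochastic ranking policy.
    """
    pos_exposure = rank_exposure(num_items, gamma)
    exposure = np.zeros(num_items)
    for ranking in sampled_rankings:
        # ranking[k] is the item shown at rank k, so it receives pos_exposure[k].
        exposure[ranking] += pos_exposure
    return exposure / len(sampled_rankings)


def group_exposure(user_item_exposure: np.ndarray,
                   user_groups: np.ndarray,
                   item_groups: np.ndarray) -> np.ndarray:
    """Aggregate a (num_users, num_items) exposure matrix into a
    (num_user_groups, num_item_groups) matrix by averaging over members,
    making systemic under- or over-exposure of item groups to user
    groups visible."""
    n_ug, n_ig = user_groups.max() + 1, item_groups.max() + 1
    agg = np.zeros((n_ug, n_ig))
    for ug in range(n_ug):
        for ig in range(n_ig):
            agg[ug, ig] = user_item_exposure[
                np.ix_(user_groups == ug, item_groups == ig)].mean()
    return agg


# Toy example: 3 users, 4 items, 100 rankings per user sampled from a
# (here purely random) stochastic policy.
rng = np.random.default_rng(0)
num_users, num_items = 3, 4
exposure_matrix = np.stack([
    expected_exposure(
        np.stack([rng.permutation(num_items) for _ in range(100)]),
        num_items)
    for _ in range(num_users)
])
print(group_exposure(exposure_matrix,
                     user_groups=np.array([0, 0, 1]),
                     item_groups=np.array([0, 1, 1, 0])))
```

Comparing a group-aggregated exposure matrix like this against a target exposure derived from relevance (as in the expected exposure framework) is what allows disparities to be measured and then optimized away with stochastic ranking policies.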