Investigating Potential Factors Associated with Gender Discrimination in Collaborative Recommender Systems

Masoud Mansoury, Himan Abdollahpouri, Jessie Smith, Arman Dehpanah, Mykola Pechenizkiy, Bamshad Mobasher

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

6 Citations (Scopus)

Abstract

The proliferation of personalized recommendation technologies has raised concerns about discrepancies in recommendation performance across different genders, age groups, and racial or ethnic populations. This varying degree of performance can erode users' trust in the system and may pose legal and ethical issues in domains where fairness and equity are critical, such as job recommendation. In this paper, we investigate several potential factors that could be associated with discriminatory performance of a recommendation algorithm for women versus men. We specifically study several characteristics of user profiles and analyze their possible associations with disparate behavior of the system towards different genders. These characteristics include the anomaly in rating behavior, the entropy of users' profiles, and the users' profile size. Our experimental results on a public dataset using four recommendation algorithms show that, with respect to all three of these factors, women receive less accurate recommendations than men, indicating unfair behavior of recommendation algorithms across genders.
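The abstract names three profile characteristics but does not define them formally. As a rough illustration only, the sketch below assumes plausible definitions: profile size as the number of ratings, profile entropy as the Shannon entropy of the distribution of rating values, and rating anomaly as the mean absolute deviation of a user's ratings from each item's average rating. These are assumptions for illustration, not the paper's exact formulations.

```python
import math
from collections import Counter

def profile_size(ratings):
    """Number of ratings in the user's profile."""
    return len(ratings)

def profile_entropy(ratings):
    """Shannon entropy (bits) of the distribution of rating values.

    A user who always gives the same rating has entropy 0; a user whose
    ratings are spread evenly over k distinct values has entropy log2(k).
    """
    counts = Counter(ratings)
    n = len(ratings)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def rating_anomaly(user_ratings, item_means):
    """Mean absolute deviation of a user's ratings from item average ratings.

    `user_ratings` maps item id -> this user's rating; `item_means` maps
    item id -> the item's mean rating over all users. (Illustrative
    definition; the paper may use a different anomaly measure.)
    """
    return sum(abs(r - item_means[i]) for i, r in user_ratings.items()) / len(user_ratings)

# Example: a user rating items 'a' and 'b'
print(profile_size([4, 4, 5, 2]))                              # 4
print(profile_entropy([4, 4, 4, 4]))                           # 0.0
print(rating_anomaly({'a': 5, 'b': 1}, {'a': 4.0, 'b': 2.0}))  # 1.0
```

Under these definitions, one could bucket users by each characteristic and compare recommendation accuracy per bucket across genders, which is the kind of analysis the abstract describes.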

Original language: English
Title of host publication: Proceedings of the 33rd International Florida Artificial Intelligence Research Society Conference, FLAIRS 2020
Editors: Eric Bell, Roman Bartak
Pages: 193-196
Number of pages: 4
ISBN (Electronic): 9781577358213
Publication status: Published - 2020
