Gender differences in beliefs about algorithmic fairness
The field of algorithmic fairness has highlighted ethical questions that may not have purely technical answers. For example, different algorithmic fairness constraints are often impossible to satisfy simultaneously, and choosing between them requires value judgments about which people may disagree. Here we investigate whether people's beliefs about algorithmic fairness correlate with their demographic backgrounds, a question of interest because computer science is demographically non-representative. If beliefs about algorithmic fairness correlate with demographics, and algorithm designers are demographically non-representative, then decisions made about algorithmic fairness may not reflect the will of the population as a whole. We show in two separate surveys that gender correlates with beliefs about algorithmic fairness. For example, women are significantly less likely to favor including gender as a feature in an algorithm that recommends courses to students if doing so would make female students less likely to be recommended science courses.
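The abstract's statement that fairness constraints are often impossible to satisfy simultaneously refers to known impossibility results. As a hedged illustration (the identity below is a standard result of this kind, e.g., Chouldechova, 2017, and is not drawn from this paper's own analysis), let p = P(Y = 1) denote a group's prevalence, PPV its positive predictive value, FNR its false negative rate, and FPR its false positive rate; a confusion-matrix identity then holds within any group:

\[
  \mathrm{FPR} \;=\; \frac{p}{1-p}\cdot\frac{1-\mathrm{PPV}}{\mathrm{PPV}}\cdot\bigl(1-\mathrm{FNR}\bigr)
\]

Because this identity holds separately for each group, two groups with different prevalences p cannot simultaneously have equal PPV, equal FNR, and equal FPR except in degenerate cases such as a perfect predictor. Deciding which of these criteria to relax is exactly the kind of value judgment, open to disagreement, that the abstract describes.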