🧀 BigCheese.ai

AI overwhelmingly prefers white and male job candidates

A University of Washington study highlights the potential for significant racial and gender bias when AI is used to screen resumes. Tests on three open-source large language models showed they favored resumes bearing white-associated names 85% of the time, preferred female-associated names only 11% of the time, and ranked resumes with Black male-associated names lowest of all.

  • The study tested three open-source large language models.
  • Models favored white-associated names over others 85% of the time.
  • Female-associated names were preferred only 11% of the time.
  • Resumes with Black male-associated names were preferred least, nearly 0% of the time.
  • Results demonstrate intersectional bias in AI resume screening.
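The study's headline percentages are preference rates drawn from many pairwise resume comparisons. A minimal sketch of how such rates could be tallied is below; the group labels and outcomes are illustrative stand-ins, not the study's actual data or methodology.

```python
from collections import Counter

def win_rates(outcomes):
    """Given pairwise screening outcomes as (winner, loser) group labels,
    return each group's win fraction across all comparisons it appeared in."""
    wins = Counter()
    appearances = Counter()
    for winner, loser in outcomes:
        wins[winner] += 1
        appearances[winner] += 1
        appearances[loser] += 1
    return {group: wins[group] / appearances[group] for group in appearances}

# Mock outcomes (NOT the study's data): the screener picked the resume with
# the white-associated name in 3 of 4 head-to-head comparisons.
mock = [("white", "black"), ("white", "black"),
        ("white", "black"), ("black", "white")]
print(win_rates(mock))  # {'white': 0.75, 'black': 0.25}
```

An intersectional audit like the study's would use finer-grained labels (e.g. race and gender combined), which this tally handles unchanged since groups are just strings.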