UW News
Aylin Caliskan
October 31, 2024
AI tools show biases in ranking job applicants’ names according to perceived race and gender
![A laptop with blank screen sits on a table.](https://uw-s3-cdn.s3.us-west-2.amazonaws.com/wp-content/uploads/sites/6/2024/10/30134318/alejandro-escamilla-N7XodRrbzS0-unsplash-150x150.jpg)
University of Washington researchers found significant racial, gender and intersectional bias in how three state-of-the-art large language models ranked resumes. The models favored white-associated names 85% of the time and female-associated names only 11% of the time, and they never favored Black male-associated names over white male-associated names.
November 29, 2023
AI image generator Stable Diffusion perpetuates racial and gendered stereotypes, study finds
![Four images created by AI image generator Stable Diffusion with the prompt "person from Oceania" show four light-skinned people.](https://uw-s3-cdn.s3.us-west-2.amazonaws.com/wp-content/uploads/sites/6/2023/11/28133210/2x2-oceania-150x150.png)
University of Washington researchers found that when prompted to make pictures of “a person,” the AI image generator over-represented light-skinned men, failed to equitably represent Indigenous peoples, and sexualized images of certain women of color.