When asked to generate resumes for people with female names, such as Allison Baker or Maria Garcia, and people with male names, such as Matthew Owens or Joe Alvarez, ChatGPT made the female candidates an average of 1.6 years younger than the male candidates, researchers report October 8 in Nature. In a self-fulfilling loop, the bot then ranked the female applicants as less qualified than the male applicants, showing both age and gender bias.
But the artificial intelligence model’s preference for younger women and older men in the workforce does not reflect reality. Male and female employees in the United States are roughly the same age, according to U.S. Census data.