Over the past year, the U.S. Equal Employment Opportunity Commission (EEOC) has ramped up its legal actions, filing more than 140 lawsuits related to employment discrimination, a roughly 50% increase over the year before. Most of these cases target companies alleged to have engaged in discriminatory practices, bias, and harassment; notably, few have focused on the misuse of automation and AI in HR processes.
However, the EEOC has been paying close attention to the potential biases in AI-driven hiring systems over the last two years. The Commission is enhancing its capability to detect issues in these systems and is collaborating with HR software providers and employers to help them recognize possible discriminatory patterns in AI technology.
EEOC Chair Charlotte Burrows, speaking at a Brookings Institution forum, highlighted the difficulty in pinpointing AI-related discrimination in hiring. One challenge is that job applicants often remain unaware that automation played a role in their rejection. Burrows mentioned that the EEOC is training its staff to identify signs of AI involvement in job rejections, such as receiving rejection notices at unlikely hours or shortly after application submission.
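The telltale signs Burrows describes lend themselves to simple timestamp checks. The sketch below is purely illustrative, not an EEOC tool: the function name, the cutoff hours, and the 15-minute review threshold are all assumptions chosen for the example.

```python
from datetime import datetime

# Hypothetical heuristic (not an EEOC method): flag rejection notices sent
# at unlikely hours or implausibly soon after the application was submitted.
ODD_HOURS = range(0, 6)          # midnight to 6am local time (assumed cutoff)
MIN_REVIEW_SECONDS = 15 * 60     # under 15 minutes suggests no human review

def automation_signals(submitted_at: datetime, rejected_at: datetime) -> list[str]:
    """Return reasons a rejection looks automated (empty list if none)."""
    signals = []
    if rejected_at.hour in ODD_HOURS:
        signals.append("sent during unlikely hours")
    if (rejected_at - submitted_at).total_seconds() < MIN_REVIEW_SECONDS:
        signals.append("sent shortly after submission")
    return signals

flags = automation_signals(
    datetime(2023, 5, 1, 14, 0),   # applied at 2:00pm
    datetime(2023, 5, 1, 14, 5),   # rejected five minutes later
)
print(flags)  # ['sent shortly after submission']
```

Neither signal is proof of automation on its own, which is why the EEOC pairs such cues with staff training rather than treating them as conclusive.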
The EEOC is also scrutinizing AI algorithms in employment screening, particularly gamified assessments, for potential biases. This extends to concerns about targeted job advertisements based on algorithmic predictions of individual characteristics.
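One established yardstick for spotting discriminatory patterns in screening outcomes, whether the screen is an algorithm or a human, is the "four-fifths rule" from the EEOC's Uniform Guidelines on Employee Selection Procedures: a selection rate for any group below 80% of the highest group's rate is generally treated as evidence of adverse impact. A minimal sketch, with invented applicant counts:

```python
# Four-fifths rule check: compare each group's selection rate to the
# highest group's rate; ratios below 0.8 suggest adverse impact.
def selection_rates(outcomes: dict[str, tuple[int, int]]) -> dict[str, float]:
    """outcomes maps group -> (selected, applied); returns rate per group."""
    return {g: selected / applied for g, (selected, applied) in outcomes.items()}

def adverse_impact(outcomes: dict[str, tuple[int, int]],
                   threshold: float = 0.8) -> dict[str, float]:
    """Return groups whose rate falls below `threshold` of the top rate."""
    rates = selection_rates(outcomes)
    top = max(rates.values())
    return {g: rate / top for g, rate in rates.items() if rate / top < threshold}

# Hypothetical applicant pool screened by an automated assessment.
flagged = adverse_impact({
    "group_a": (45, 100),   # 45% selected
    "group_b": (30, 100),   # 30% selected -> ratio 30/45, below 0.8
})
print(round(flagged["group_b"], 2))  # 0.67
```

The rule is a screening threshold rather than a legal conclusion, but it is the kind of statistical pattern the Commission looks for when auditing automated hiring tools.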
Burrows pointed out a major challenge in addressing AI bias in HR: the complexity and lack of transparency in AI systems. Understanding these algorithms requires specific expertise, creating a divide between civil rights law and AI technology.
Another issue is the lack of diversity in AI development teams, which could lead to unintentional discrimination against certain groups. The EEOC’s first lawsuit regarding automated hiring discrimination was against iTutorGroup Inc. in 2022. The company was accused of using software that automatically rejected applicants based on age, leading to a $365,000 settlement for the affected applicants.
Following this case, employment attorneys have been cautioning HR departments about the risks associated with automated decision-making systems. Jennifer Dunlap, an employment attorney, emphasized that employers cannot entirely depend on vendors’ assurances about their software’s compliance with anti-discrimination laws. She advised that employers must ensure their staff are well-trained and do not unintentionally introduce bias into these systems.