Supplemental materials for LEAP 2025 session AI and HR: Shaping the Future Together
Relying on an artificial intelligence-based service to screen résumés and select candidates to interview may seem like a great idea. AI vendors tout the ability of their algorithms to quickly weed out unqualified applicants.
They also suggest that using AI to screen candidates can insulate employers from discrimination charges. After all, bias can’t taint the hiring process if no one in HR or on a selection team reviews résumés and applications, right?
Maybe not. AI skeptics and a growing cadre of plaintiffs’ attorneys argue that instead of preventing hiring bias, relying on an AI algorithm may actually bake discrimination into the selection process. That’s what is alleged in one recent complaint filed with the Federal Trade Commission and another with the Equal Employment Opportunity Commission (EEOC).
The complaint: The American Civil Liberties Union recently filed dual complaints with the FTC and the EEOC alleging that a personality assessment tool based on AI illegally screens out people with disabilities. Aon, a risk management company that provides HR solutions, sells the tool, as well as a video-interview system and a cognitive-assessment program.
Aon says its products improve the recruitment process and also increase the diversity of the candidate pool.
The ACLU complained to the FTC that Aon deceived its customers by claiming the products are “bias-free” when in reality they appear to discriminate. The ACLU alleges the screening program treats as negative several applicant characteristics that diagnostic criteria associate with mental-health disabilities and autism. As a result, applicants who exhibit characteristics consistent with a mental-health condition or autism are flagged as poor job candidates.
How do the screening tools work? Applicants read a series of statements and indicate whether they agree with each one. The video-interview tool then assesses the candidate’s performance and responses against diagnostic criteria. For example, the program may treat behavior such as looking down or gazing around the room instead of looking directly at the camera as disqualifying. Yet it’s well documented that many people with autism avoid direct eye contact.
Finally, the tools also transcribe speech. The ACLU claims the program routinely makes more mistakes when transcribing Black candidates’ speech than non-Black candidates’ speech.
If you use AI tools to screen candidates, take these steps to minimize your discrimination liability:
Interested in joining HR Employment Law Advisor? For a limited time, LEAP 2025 attendees can get an annual membership at 30% off at https://www.hremploymentlawadvisor.com/leap30.