As artificial intelligence recruitment and hiring tools become more prevalent, it is important to remember that such processes are subject to anti-discrimination laws. Employers have an obligation to inspect these tools and processes for bias based on any protected class (including disability and age) and should have plans to provide reasonable accommodations during the recruitment and hiring process. On May 12, 2022, the Equal Employment Opportunity Commission (EEOC) and the Justice Department issued guidance for the first time on the use of algorithms and artificial intelligence in employment-related decision making and the ways such tools may violate disability discrimination laws. The guidance clarifies that employers are responsible for ensuring that their hiring technologies, including any artificial intelligence used, comply fully with disability discrimination laws even if the technology is administered by a third party, and that employers must provide reasonable accommodation as needed. It further provides that, regardless of intent, if an artificial intelligence tool has the effect of screening out applicants with disabilities or otherwise adversely affecting individuals with disabilities, the employer may be violating disability discrimination laws. The guidance directs employers to be critical of the artificial intelligence hiring tools they use, recommends asking vendors a number of questions, and advises that employers “only develop and select tools that measure abilities and qualifications that are truly necessary for the job – even for people who are entitled to an on-the-job reasonable accommodation.” https://www.eeoc.gov/laws/guidance/americans-disabilities-act-and-use-software-algorithms-and-artificial-intelligence.
In addition, this month the EEOC sued an employer for the first time over its use of artificial intelligence hiring tools. Specifically, the EEOC sued three integrated companies providing English-language tutoring services under the iTutorGroup brand name for age discrimination, alleging that they programmed their online recruitment software to automatically reject older applicants because of their age. According to the EEOC’s press release regarding the lawsuit: “[The companies] hire thousands of tutors based in the United States each year to provide online tutoring from their homes or other remote locations. According to the EEOC’s lawsuit, in 2020, [the companies] programmed their tutor application software to automatically reject female applicants age 55 or older and male applicants age 60 or older. [The companies] rejected more than 200 qualified applicants based in the United States because of their age.”
Accordingly, employers need to examine their artificial intelligence recruitment and hiring tools now to ensure the algorithms in the tools do not unfairly screen out individuals based on their membership in a protected class.
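As a purely illustrative sketch of what such an examination might look like in practice, the snippet below compares selection rates across groups in a tool's historical decisions and flags any group whose rate falls below four-fifths of the most-selected group's rate, echoing the EEOC's longstanding "four-fifths" rule of thumb for adverse impact. This assumes the tool's decisions can be exported as (group, outcome) records; the data, group labels, and 0.8 threshold here are hypothetical illustrations, not a legal test, and flagged disparities call for legal and statistical review rather than automated conclusions.

```python
from collections import Counter

def selection_rates(records):
    """Compute the selection (advancement) rate per group
    from (group, selected) records."""
    totals, selected = Counter(), Counter()
    for group, was_selected in records:
        totals[group] += 1
        if was_selected:
            selected[group] += 1
    return {g: selected[g] / totals[g] for g in totals}

def adverse_impact_ratios(rates):
    """Ratio of each group's rate to the highest group's rate.
    Ratios below 0.8 warrant scrutiny under the EEOC's
    four-fifths rule of thumb (illustrative, not a legal test)."""
    best = max(rates.values())
    return {g: r / best for g, r in rates.items()}

# Hypothetical audit data: (age_band, was_advanced)
records = ([("under_40", True)] * 60 + [("under_40", False)] * 40
           + [("40_plus", True)] * 30 + [("40_plus", False)] * 70)

rates = selection_rates(records)       # under_40: 0.6, 40_plus: 0.3
ratios = adverse_impact_ratios(rates)  # 40_plus: 0.5 relative to under_40
flagged = [g for g, r in ratios.items() if r < 0.8]
print(flagged)  # ['40_plus']
```

In this hypothetical, applicants 40 and older advance at half the rate of younger applicants, so the 40-plus group is flagged for further review. A real audit would, of course, cover every protected class, account for accommodation requests, and involve counsel.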