Examples of AI in hiring include:
- Advertising jobs to targeted applicants;
- Screening job applicants;
- Conducting online interviews using “chat” boxes;
- Using computerized screening tests that measure applicants’ skills, abilities, and personalities; and
- Scoring applicants’ resumes.
The guidance clarifies that employers are responsible for ensuring their hiring technologies—including any embedded AI—comply fully with the ADA, even if the technology is administered by a third party. Regardless of intent, if the use of a technology has the effect of screening out applicants with disabilities, or of adversely impacting individuals with a disability, the employer may be violating the ADA. Similarly, if an applicant requests a reasonable accommodation because the technology is not accessible due to the individual’s disability, the employer has an obligation to provide a reasonable accommodation, even if the request is made to the third party administering the technology.
In light of this guidance, employers should:
- Examine computerized hiring tools to ensure algorithms in the AI do not unfairly screen out individuals with disabilities;
- Ensure the technology is accessible to all individuals with disabilities;
- Provide clear information and procedures to job applicants for requesting a reasonable accommodation, and ensure that requesting an accommodation does not decrease an applicant’s chances of being hired;
- Screen technology vendors carefully to ensure their compliance with the ADA; and
- Keep all accommodation requests and related discussions confidential, and store related information in a confidential “medical” file.
As technology becomes more entrenched in the hiring process, employers need to ensure that automation does not inadvertently lead to disability discrimination. Employers should consult with their legal counsel on any questions concerning ADA compliance in general and AI in particular. If you have any questions, please contact a member of our Labor & Employment Team.