You may have heard about the “Great Resignation” recently – the growing trend of employees resigning in large numbers. Competition among employers to attract and hire top talent is fierce. To expedite hiring, employers may opt to use software that streamlines the process by employing artificial intelligence (“AI”) and algorithms to make staffing decisions.
Recently, however, the Department of Justice (“DOJ”) and the Equal Employment Opportunity Commission (“EEOC”) issued guidance on complying with the Americans with Disabilities Act (“ADA”) when such technology is used. When deploying AI and algorithms to vet candidates, employers should monitor their hiring procedures for bias and assess where in the process reasonable accommodations should be offered.
Here are a few takeaways from the DOJ and EEOC guidelines:
- Software Issues: The software employers use to expedite the hiring process may include AI assistants that ask screening questions of prospective candidates; tools that extract and rank resumes by keyword; technology that assesses candidates based on facial expressions and speech patterns; and testing software that scores candidates on their disposition or skills. While these tools can fast-track the process for employers, they can also disadvantage job applicants and employees with disabilities.
- For example, a chatbot that screens out prospective candidates because of gaps in their employment history may violate the ADA if the gap was due to a disability or the need to undergo treatment; or
- For example, an employer hiring cashiers should ensure that its chatbot software does not reject candidates who are unable to stand for long periods. Otherwise, the chatbot might eliminate an applicant who uses a wheelchair and who may be entitled to a reasonable accommodation (e.g., a cash register placed at a lower height).
- Accessibility: Employers should note that individuals with visual, auditory, or other impairments may face challenges in accessing the online and interactive tools used in the hiring process. Accommodations should be made where possible.
- Accommodation: Unless doing so would impose an undue hardship, employers should not rely exclusively on AI/algorithmic tools to vet an applicant whose disability makes those evaluation tools less reliable. Further, employers must ensure that any reasonable accommodations are provided alongside the AI/algorithmic tools applicants are asked to use.
What Can Employers Do?
Employers deploying AI in their hiring process need to ensure that they:
- are providing reasonable accommodations where possible;
- are not inadvertently screening out individuals with disabilities; and
- understand the other ways an AI/algorithm tool may violate the ADA.
It is critical for employers to understand that they cannot shift the blame for ADA violations to the third-party vendors that provide this software; employers remain responsible.
Contact us for more information on assessing your hiring procedure by e-mailing us at info@mnklawyers.com.
This material is provided for informational purposes only. It is not intended to constitute legal advice, nor does it create a client-lawyer relationship between MNK Law and any recipient. Recipients should consult with counsel before taking any actions based on the information contained within this material.