
EEOC Issues Guidance on Artificial Intelligence Hiring Tools

McGuireWoods Legal Insights, May 23, 2023

Miles Indest


On May 18, 2023, the U.S. Equal Employment Opportunity Commission (EEOC) issued a new technical assistance document titled “Assessing Adverse Impact in Software, Algorithms, and Artificial Intelligence Used in Employment Selection Procedures Under Title VII of the Civil Rights Act of 1964.”

This technical assistance is part of the EEOC’s 2021 agencywide initiative to ensure that the use of software such as artificial intelligence (AI), machine learning and other emerging technologies in hiring and other employment decisions complies with the federal civil rights laws enforced by the agency. The new guidance builds on the Uniform Guidelines on Employee Selection Procedures (UGESP) adopted by the EEOC in 1978, as well as guidance issued last year addressing the use of artificial intelligence in hiring within the context of the Americans with Disabilities Act.

The technical assistance addresses the potential discriminatory impact of using algorithmic decision-making tools, defined as the computer analysis of data that an employer relies on, either partly or in whole, when making decisions about employment. The guidance highlights the following examples of such software available to employers:

  • resume scanners that prioritize applications using certain keywords;
  • employee-monitoring software that rates employees on the basis of their keystrokes or other factors;
  • virtual assistants or chatbots that ask job candidates about their qualifications and reject those who do not meet predefined requirements;
  • video interviewing software that evaluates candidates based on their facial expressions and speech patterns; and
  • testing software that provides “job fit” scores for applicants or employees regarding their personalities, aptitudes, cognitive skills or perceived “cultural fit” based on their performance on a game or on a more traditional test.

How can employers tell if their algorithmic decision-making tools are in danger of violating federal employment discrimination laws? According to the EEOC, a selection tool that produces a substantially lower selection rate for individuals who share one or more protected characteristics can be indicative of discrimination. The technical guidance reminds employers that although AI systems have the appearance of objectivity, they are developed by humans and therefore are subject to the societal and personal biases that can create disparate outcomes in hiring.

The EEOC provides direction on how to evaluate the extent to which bias may permeate an employer’s automated process. The technical assistance directly states that the “four-fifths rule” can be applied to AI tools to help identify disparate impact. This test, described in detail in the UGESP, treats the selection rate for one group as “substantially” different from the selection rate of another group if the ratio of the two rates is less than four-fifths (80%). For example, suppose an employer’s hiring tool produces a selection rate of 30% for black applicants and a selection rate of 60% for white applicants. Because the ratio of those two rates (30/60, or 50%) is lower than four-fifths, the selection rate for black applicants is substantially different from the selection rate for white applicants and could evidence discrimination against black applicants.
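
The four-fifths comparison is simple arithmetic. A minimal sketch in Python makes the calculation concrete; the helper names here are illustrative, not part of any EEOC tool:

```python
def four_fifths_ratio(rate_a: float, rate_b: float) -> float:
    """Ratio of the lower selection rate to the higher one."""
    low, high = sorted((rate_a, rate_b))
    return low / high

def substantially_different(rate_a: float, rate_b: float,
                            threshold: float = 0.8) -> bool:
    """True if the ratio of selection rates falls below four-fifths (80%)."""
    return four_fifths_ratio(rate_a, rate_b) < threshold

# The guidance's example: a 30% selection rate versus a 60% selection rate.
ratio = four_fifths_ratio(0.30, 0.60)          # 0.5, i.e., 50%
flagged = substantially_different(0.30, 0.60)  # True: 50% is below 80%
```

By contrast, rates of 50% and 60% yield a ratio of roughly 83%, which this rule of thumb would not flag.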

The EEOC reiterates that the four-fifths rule is a good rule of thumb, but quickly dashes employers’ hopes of calculating their way into compliance with a simple formula. In some situations, the four-fifths rule will not be a reasonable substitute for a test of statistical significance; for example, where many selections are made, a ratio above four-fifths may still mask a statistically significant disparity between groups. As with traditional selection processes, employers should subject AI tools to holistic review; compliance with any one test cannot disprove discriminatory outcomes. The EEOC recommends that employers conduct self-analyses and audits on an ongoing basis. The EEOC also makes clear that employers need not discard their existing AI tools, but they should adjust those tools to remedy discriminatory selection rates. Because algorithms can be adjusted, failing to do so may expose an employer to liability.
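
As one illustration of the kind of statistical-significance test the guidance alludes to, the sketch below runs a standard two-sided two-proportion z-test using only the Python standard library. The function name and the example counts are assumptions for illustration; the EEOC does not prescribe any particular test or implementation:

```python
from math import erfc, sqrt

def two_proportion_z_test(selected_a: int, total_a: int,
                          selected_b: int, total_b: int) -> tuple[float, float]:
    """Two-sided two-proportion z-test on the difference in selection rates.

    Returns (z statistic, p-value). A small p-value suggests the observed
    difference in selection rates is unlikely to be due to chance alone.
    """
    rate_a = selected_a / total_a
    rate_b = selected_b / total_b
    pooled = (selected_a + selected_b) / (total_a + total_b)
    std_err = sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    z = (rate_a - rate_b) / std_err
    p_value = erfc(abs(z) / sqrt(2))  # two-sided tail probability
    return z, p_value

# Hypothetical counts: 30 of 100 applicants selected in one group
# versus 60 of 100 in another.
z, p = two_proportion_z_test(30, 100, 60, 100)
```

Either way, a single statistic is only a screening device; the holistic review the EEOC describes still applies.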

Many employers may hope to circumvent these concerns by outsourcing AI hiring tools to third-party vendors. The technical assistance, however, states that employers may still be liable for their agents’ violations of federal employment discrimination laws. Employers therefore should take steps to determine if vendors or developers are building and auditing their AI tools for any discriminatory impact. The EEOC recommends asking vendors specifically if they relied on the four-fifths rule, or other court-approved standards like statistical significance, when auditing their product.

Tips and Takeaways

The technical assistance urges employers to take a hands-on approach to auditing AI usage in their hiring processes. The following tips may aid employers in that task:

  • Maintain human oversight of AI tools. Employers should subject automated hiring processes to consistent review, not only to confirm that these tools provide accurate insights, but also to ensure that they do not reflect the biases of the individuals who build and maintain them. Performing self-audits is crucial for employers to prevent discriminatory hiring practices.
  • Do not delegate compliance to AI vendors. Employers should perform due diligence around which AI tools they implement by asking vendors pointed questions about testing and audit practices with a focus on disparate impact. Employers also should review their commercial contracts with AI vendors to ensure that indemnities and other contractual allocation of risk are properly addressed.
  • Continue organizational bias training. Both implicit and explicit bias training are essential to identify potentially discriminatory practices and should form the foundation when building meaningful audit procedures for hiring practices, especially automated decision-making tools.

For questions about how artificial intelligence presents both risks and opportunities for employers, contact the authors of this article.
