Insight

Algorithmic Exclusion

The Workday lawsuit and the future of AI in hiring.


Bryan Driscoll

September 19, 2025 08:00 AM

Algorithmic bias is now a legal battleground. In Mobley v. Workday, a federal court allowed age discrimination claims to proceed against the HR software provider, framing the case as an early test of how civil rights law applies to AI-driven hiring. The plaintiffs argue that Workday’s screening technology filtered out older candidates across hundreds of job applications, often within minutes, before any human review.

AI is embedded across the hiring landscape—87% of employers now rely on it to evaluate candidates—but statutory frameworks haven’t kept pace with the scale or structure of automated decision-making. Mobley forces the courts to consider a key legal question: When an algorithm drives employment outcomes, can the vendor be held liable under anti-discrimination laws, even if the employer technically controls the final decision?

That question cuts to the heart of accountability in modern hiring, where algorithms don’t just assist. They decide.

Discrimination by Algorithm: What the Workday Suit Alleges

Derek Mobley, a seasoned professional with a degree from Morehouse College and experience across finance, IT, and customer service, submitted more than 100 job applications through Workday’s platform. He was rejected every time—often within minutes, sometimes in the middle of the night—without an interview.

He’s not alone. Four additional plaintiffs, all over 40, have joined the case, describing similar patterns of near-instant rejections and alleging that Workday’s applicant screening tools disproportionately excluded them based on age.

The claim centers on disparate impact. Plaintiffs argue that Workday’s algorithm functionally deprioritized older applicants, effectively excluding a protected class from employment consideration. According to the complaint, the discrimination flowed from the software’s design—ranking, screening, and filtering in ways that penalized age with no human intervention.

Workday denies responsibility. It maintains that the hiring decisions rest with its clients, not with the platform. The company argues that its technology simply compares candidate qualifications to client-defined job requirements and does not identify or act on protected characteristics.

That position hasn’t ended the case. A federal judge in California allowed the lawsuit to proceed as a collective action, rejecting Workday’s attempt to force individualized claims. While intentional discrimination claims were dismissed, the disparate impact allegations survived. The litigation now moves forward with the central premise intact: that exclusion was systemic, not accidental, and that algorithmic tools can trigger liability even when a vendor sits one step removed from the hiring decision.

Disparate Impact in the AI Era

The disparate impact framework under Title VII and the ADEA allows plaintiffs to challenge facially neutral employment practices that disproportionately harm protected groups. When older applicants are consistently screened out by a hiring system—even without explicit bias or intent—that practice can still trigger liability if the impact is significant and unjustified by business necessity.

For decades, this doctrine has functioned as a backstop against systemic exclusion. But AI introduces new friction.

Algorithmic tools often rely on proxy variables, inputs that correlate with protected traits like age or race without naming them outright. Educational background, location, word choice, and even resume formatting can operate as stand-ins for demographic characteristics. The result is a model that appears neutral on its face but consistently favors younger candidates, especially if trained on data reflecting a younger existing workforce.

That training data is central to the issue. When a model is built on past hiring decisions, it replicates whatever bias those decisions reflect.

If older applicants were historically overlooked, the algorithm may infer that youth is a positive hiring signal. And because many of these systems are black boxes—protected as proprietary or too complex to interpret—plaintiffs face an evidentiary wall. They can’t point to a specific exclusionary rule or score threshold; they have only patterns, outcomes, and statistical disparities.

In Mobley, the plaintiffs argue this is enough. They claim the volume and speed of rejections across hundreds of applications supports an inference that Workday’s system functionally screened them out based on age. Whether that’s enough for liability will depend on how courts apply the disparate impact standard to algorithmic systems.

The EEOC has taken steps to clarify that AI tools fall within the scope of Title VII enforcement. In recent guidance, the agency emphasized that employers cannot contract away their obligations simply by outsourcing hiring to algorithmic systems. Precedent from earlier technological contexts—such as Amazon's abandoned resume screener, which penalized female applicants—reinforces that AI systems often replicate, rather than resolve, historical bias.

Platform Liability and the Vendor Defense

Workday’s defense rests on role distinction. The company argues that it provides tools, not decisions. Clients choose whom to interview and hire. The software only executes client-defined criteria and delivers efficiencies at scale. That line may hold in theory, but in practice, courts are increasingly willing to examine where control ends and causation begins.

Plaintiffs in Mobley assert that Workday’s algorithm didn’t just assist employers—it functioned as a gatekeeper. They allege the tool didn’t merely flag candidates but filtered them, scored them, and eliminated them before any human interaction. That level of involvement reframes the software’s role from passive infrastructure to active participant.

Several established doctrines could reach a vendor in that position. Joint employer doctrine asks whether a third party exerts meaningful control over employment conditions. Vicarious liability principles ask whether an actor contributed to a harm in a foreseeable and proximate way. Even product liability analogies may be relevant: if a product's design predictably causes exclusion, should its maker be held accountable?

SaaS contracts often disclaim this kind of responsibility. Vendors like Workday typically assert that the employer controls job descriptions, qualification criteria, and final decisions.

But courts may weigh those disclaimers against operational reality. If the system’s architecture predictably yields discriminatory results, and if the vendor knows or should know that outcome is likely, then disclaimers may carry little weight.

What Employers and Counsel Should Do Today

The Mobley case may not yet define liability boundaries, but it has made one thing unavoidable: legal exposure grows as employers rely on systems they don’t fully understand. Waiting for a ruling won’t protect companies already using automated hiring tools. Counsel should act now to shape how those systems are deployed and defended.

The starting point is visibility. Most companies using third-party screening software haven’t audited how it scores candidates or what traits it may privilege. Legal teams should push for clarity:

● What criteria are being used to rank applicants?

● How does the system weigh experience versus education?

● Are rejection patterns monitored across age, race, or disability?
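One concrete way to monitor rejection patterns is the EEOC's "four-fifths" rule from the Uniform Guidelines on Employee Selection Procedures: a selection rate for a protected group below 80% of the highest group's rate is commonly treated as preliminary evidence of adverse impact. The sketch below illustrates the arithmetic only; all applicant counts and group labels are invented for illustration, not drawn from the Mobley record.

```python
# Minimal adverse impact check using the EEOC "four-fifths" rule.
# All applicant-flow figures below are hypothetical.

def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of applicants who advanced past the screen."""
    return selected / applicants

# Invented applicant-flow data, bucketed by age group.
groups = {
    "under_40": selection_rate(selected=120, applicants=400),    # 0.30
    "40_and_over": selection_rate(selected=30, applicants=300),  # 0.10
}

# Compare each group's rate to the highest-performing group's rate.
highest = max(groups.values())
for name, rate in groups.items():
    ratio = rate / highest
    flag = "POTENTIAL ADVERSE IMPACT" if ratio < 0.8 else "ok"
    print(f"{name}: rate={rate:.2f}, ratio={ratio:.2f} -> {flag}")
```

In this hypothetical, the over-40 group's selection rate is one third of the under-40 group's—well below the four-fifths threshold—so the screen would warrant further statistical and business-necessity review.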

Vendor contracts deserve scrutiny. Boilerplate disclaimers that the client retains control carry less weight if the system effectively prevents candidates from reaching a human reviewer. Counsel should assess not just who owns the hiring decision on paper, but who shapes it in practice, and ensure contracts reflect those realities.

Risk mitigation also requires documentation. Employers should preserve records showing how hiring tools are configured, how often settings are reviewed, and when human intervention occurs. Counsel should work with HR and compliance leaders to formalize internal reviews, not as performative audits, but as evidence of diligence.

The role of outside counsel isn’t just to react when clients face litigation. It’s to ensure those clients can explain, with confidence and specificity, how their systems operate. That’s what opposing counsel will demand. That’s what a court will scrutinize. And that’s what this moment requires.

Building a Litigation-Ready AI Hiring Process

The most effective way to manage AI risk is to structure hiring systems with legal scrutiny in mind from the start. That means treating AI not as a replacement for hiring managers or recruiters but as a tool—one that supports decision-making, never substitutes for it.

The core problem in the Workday suit isn’t just the existence of bias—it’s the lack of meaningful human oversight. Plaintiffs allege they were screened out by an automated process that made categorical judgments with no person reviewing qualifications. That kind of delegation invites liability, especially when patterns of exclusion align with protected traits like age.

To mitigate that risk, employers need hiring systems that preserve decision-making authority for humans. That starts with designing workflows where AI performs a supporting role: sorting, flagging, or summarizing, not disqualifying. Any system that scores or ranks candidates should route borderline or outlier profiles for human review, particularly when the inputs or outputs correlate with age, disability, or race. Automating those steps may create efficiency, but it also strips out the discretion that protects employers from disparate impact claims.

Legal counsel should push for explainability at every layer of the process. That includes maintaining version histories of models, documenting how inputs are selected, and preserving logs that show how hiring decisions are made.

Ultimately, defensibility doesn’t come from disclaimers. It comes from structure. Employers that treat AI as a co-pilot are in the strongest position to defend their hiring decisions. And attorneys who help build that structure are raising the standard for what responsible use of AI in employment actually looks like.

Headline Image: iStock/skynesher
