
Algorithmic Exclusion

The Workday lawsuit and the future of AI in hiring.


Bryan Driscoll

September 19, 2025 08:00 AM

Algorithmic bias is now a legal battleground. In Mobley v. Workday, a federal court allowed age discrimination claims to proceed against the HR software provider, framing the case as an early test of how civil rights law applies to AI-driven hiring. The plaintiffs argue that Workday’s screening technology filtered out older candidates across hundreds of job applications, often within minutes, before any human review.

AI is embedded across the hiring landscape—87% of employers now rely on it to evaluate candidates—but statutory frameworks haven’t kept pace with the scale or structure of automated decision-making. Mobley forces the courts to consider a key legal question: When an algorithm drives employment outcomes, can the vendor be held liable under anti-discrimination laws, even if the employer technically controls the final decision?

That question cuts to the heart of accountability in modern hiring, where algorithms don’t just assist. They decide.

Discrimination by Algorithm: What the Workday Suit Alleges

Derek Mobley, a seasoned professional with a degree from Morehouse College and experience across finance, IT, and customer service, submitted more than 100 job applications through Workday’s platform. He was rejected every time—often within minutes, sometimes in the middle of the night—without an interview.

He’s not alone. Four additional plaintiffs, all over 40, have joined the case, describing similar patterns of near-instant rejections and alleging that Workday’s applicant screening tools disproportionately excluded them based on age.

The claim centers on disparate impact. Plaintiffs argue that Workday’s algorithm functionally deprioritized older applicants, effectively excluding a protected class from employment consideration. According to the complaint, the discrimination flowed from the software’s design—ranking, screening, and filtering in ways that penalized age with no human intervention.

Workday denies responsibility. It maintains that the hiring decisions rest with its clients, not with the platform. The company argues that its technology simply compares candidate qualifications to client-defined job requirements and does not identify or act on protected characteristics.

That position hasn’t ended the case. A federal judge in California allowed the lawsuit to proceed as a collective action, rejecting Workday’s attempt to force individualized claims. While intentional discrimination claims were dismissed, the disparate impact allegations survived. The litigation now moves forward with the central premise intact: that exclusion was systemic, not accidental, and that algorithmic tools can trigger liability even when a vendor sits one step removed from the hiring decision.

Disparate Impact in the AI Era

The disparate impact framework under Title VII and the ADEA allows plaintiffs to challenge facially neutral employment practices that disproportionately harm protected groups. When older applicants are consistently screened out by a hiring system—even without explicit bias or intent—that practice can still trigger liability if the impact is significant and unjustified by business necessity.
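One common rule of thumb for when an impact is "significant" is the EEOC's four-fifths guideline: if a protected group's selection rate falls below 80% of the most-favored group's rate, the disparity generally warrants scrutiny. A minimal sketch of that arithmetic, using hypothetical numbers rather than anything from the Mobley record:

```python
def adverse_impact_ratio(selected_a, total_a, selected_b, total_b):
    """Selection-rate ratio between a protected group (a) and the
    most-favored group (b). Values below 0.8 flag potential disparate
    impact under the EEOC's four-fifths rule of thumb."""
    rate_a = selected_a / total_a
    rate_b = selected_b / total_b
    return rate_a / rate_b

# Hypothetical figures: 12 of 200 applicants over 40 advanced to
# interview, versus 45 of 300 applicants under 40.
ratio = adverse_impact_ratio(12, 200, 45, 300)
print(round(ratio, 2))  # 0.4 -> well below the 0.8 threshold
```

The ratio is a screening heuristic, not a liability test; courts also weigh statistical significance and the employer's business-necessity defense.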

For decades, this doctrine has functioned as a backstop against systemic exclusion. But AI introduces new friction.

Algorithmic tools often rely on proxy variables: inputs that correlate with protected traits like age or race without naming them outright. Educational background, location, word choice, and even resume formatting can operate as stand-ins for demographic characteristics. The result is a model that appears neutral on its face but consistently favors younger candidates, especially if trained on data reflecting a younger existing workforce.

That training data is central to the issue. When a model is built on past hiring decisions, it replicates whatever bias those decisions reflect.

If older applicants were historically overlooked, the algorithm may infer that youth is a positive hiring signal. And because many of these systems are black boxes—protected as proprietary or too complex to interpret—plaintiffs face an evidentiary wall. They can’t point to a specific exclusionary rule or score threshold; they have only patterns, outcomes, and statistical disparities.

In Mobley, the plaintiffs argue this is enough. They claim the volume and speed of rejections across hundreds of applications supports an inference that Workday’s system functionally screened them out based on age. Whether that’s enough for liability will depend on how courts apply the disparate impact standard to algorithmic systems.

The EEOC has taken steps to clarify that AI tools fall within the scope of Title VII enforcement. In recent guidance, the agency emphasized that employers cannot contract away their obligations simply by outsourcing hiring to algorithmic systems. Precedent from earlier technological contexts—such as the abandoned Amazon resume screener that penalized female applicants—reinforces that AI systems often replicate, rather than resolve, historical bias.

Platform Liability and the Vendor Defense

Workday’s defense rests on role distinction. The company argues that it provides tools, not decisions. Clients choose whom to interview and hire. The software only executes client-defined criteria and delivers efficiencies at scale. That line may hold in theory, but in practice, courts are increasingly willing to examine where control ends and causation begins.

Plaintiffs in Mobley assert that Workday’s algorithm didn’t just assist employers—it functioned as a gatekeeper. They allege the tool didn’t merely flag candidates but filtered them, scored them, and eliminated them before any human interaction. That level of involvement reframes the software’s role from passive infrastructure to active participant.

Several existing doctrines could supply the bridge from vendor to liability. Joint employer doctrine asks whether a third party exerts meaningful control over employment conditions. Vicarious liability principles evaluate whether an actor contributed to a harm in a foreseeable and proximate way. Even product liability analogies may be relevant here: if a product’s design predictably causes exclusion, should its maker be held accountable?

SaaS contracts often disclaim this kind of responsibility. Vendors like Workday typically assert that the employer controls job descriptions, qualification criteria, and final decisions.

But courts may weigh those disclaimers against operational reality. If the system’s architecture predictably yields discriminatory results, and if the vendor knows or should know that outcome is likely, then disclaimers may carry little weight.

What Employers and Counsel Should Do Today

The Mobley case may not yet define liability boundaries, but it has made one thing unavoidable: legal exposure grows as employers rely on systems they don’t fully understand. Waiting for a ruling won’t protect companies already using automated hiring tools. Counsel should act now to shape how those systems are deployed and defended.

The starting point is visibility. Most companies using third-party screening software haven’t audited how it scores candidates or what traits it may privilege. Legal teams should push for clarity:

● What criteria are being used to rank applicants?

● How does the system weigh experience versus education?

● Are rejection patterns monitored across age, race, or disability?
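The third question, monitoring rejection patterns, is the easiest to operationalize. A minimal sketch, assuming a simple ATS export of (age band, outcome) records; the field names and data below are hypothetical, not any vendor's actual schema:

```python
from collections import defaultdict

# Hypothetical applicant log pulled from an ATS export.
applications = [
    ("under_40", "advanced"), ("under_40", "rejected"),
    ("under_40", "advanced"), ("over_40", "rejected"),
    ("over_40", "rejected"), ("over_40", "advanced"),
]

def rejection_rates(records):
    """Per-group rejection rate; large gaps between groups are the
    kind of statistical disparity disparate-impact claims rest on."""
    totals, rejected = defaultdict(int), defaultdict(int)
    for band, outcome in records:
        totals[band] += 1
        if outcome == "rejected":
            rejected[band] += 1
    return {band: rejected[band] / totals[band] for band in totals}

print(rejection_rates(applications))
```

Even this coarse tally, run periodically and preserved, gives counsel the documented diligence the next section describes.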

Vendor contracts deserve scrutiny. Boilerplate disclaimers that the client retains control carry less weight if the system effectively prevents candidates from reaching a human reviewer. Counsel should assess not just who owns the hiring decision on paper, but who shapes it in practice, and ensure contracts reflect those realities.

Risk mitigation also requires documentation. Employers should preserve records showing how hiring tools are configured, how often settings are reviewed, and when human intervention occurs. Counsel should work with HR and compliance leaders to formalize internal reviews, not as performative audits, but as evidence of diligence.

The role of outside counsel isn’t just to react when clients face litigation. It’s to ensure those clients can explain, with confidence and specificity, how their systems operate. That’s what opposing counsel will demand. That’s what a court will scrutinize. And that’s what this moment requires.

Building a Litigation-Ready AI Hiring Process

The most effective way to manage AI risk is to structure hiring systems with legal scrutiny in mind from the start. That means treating AI not as a replacement for hiring managers or recruiters but as a tool—one that supports decision-making, never substitutes for it.

The core problem in the Workday suit isn’t just the existence of bias—it’s the lack of meaningful human oversight. Plaintiffs allege they were screened out by an automated process that made categorical judgments with no person reviewing qualifications. That kind of delegation invites liability, especially when patterns of exclusion align with protected traits like age.

To mitigate that risk, employers need hiring systems that preserve decision-making authority for humans. That starts with designing workflows where AI performs a supporting role: sorting, flagging, or summarizing, not disqualifying. Any system that scores or ranks candidates should route borderline or outlier profiles for human review, particularly when the inputs or outputs correlate with age, disability, or race. Automating those steps may create efficiency, but it also strips out the discretion that protects employers from disparate impact claims.
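That supporting-role design can be made concrete in the routing logic itself. A minimal sketch, assuming a hypothetical score-and-flag pipeline; the threshold, flag names, and labels are illustrative, and the key property is that the model can accelerate a candidate but never reject one:

```python
# Hypothetical values; not any vendor's actual configuration.
FAST_TRACK_SCORE = 0.8

def route(score: float, flags: list[str]) -> str:
    """AI output only sorts the review queue; no candidate is
    auto-rejected by the model."""
    if score >= FAST_TRACK_SCORE and not flags:
        return "fast-track to recruiter"
    # Low scores and flagged outliers go to a person rather than
    # being screened out before any human interaction.
    return "standard human review"
```

The design choice is the point: the automated step can only reorder and summarize, so the human discretion that insulates employers from disparate impact claims is preserved at the decision itself.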

Legal counsel should push for explainability at every layer of the process. That includes maintaining version histories of models, documenting how inputs are selected, and preserving logs that show how hiring decisions are made.

Ultimately, defensibility doesn’t come from disclaimers. It comes from structure. Employers that treat AI as a co-pilot are in the strongest position to defend their hiring decisions. And attorneys who help build that structure are raising the standard for what responsible use of AI in employment actually looks like.

Headline Image: iStock/skynesher
