Employers Find Openings to Share AI Bias Liability With Vendors (2024)

Early signs indicate that employers hit with hiring bias lawsuits over their use of artificial intelligence-based tools will have openings to try to share liability with the vendors that design the technology.

Those tech developers have so far avoided getting enmeshed in lawsuits in which job applicants claim that human resources tools produce algorithmic discrimination, violating civil rights laws by denying them a fair shot at employment.

But novel legal arguments by job-seekers, as well as attempts by employers to share or pass off liability, could force vendors into the litigation fray just as the use of AI in recruitment and selection becomes both more common and closely scrutinized by federal agencies and the courts.

“Typically, vendors who provide selection tools don’t view themselves as really part of the employment process,” said Todd Horn, a partner at Venable LLP. “There has been in recent years a bigger push to hold the vendors of selection tools liable under employment discrimination laws as an agent of the employer.”

AI Vendors as Agents

Discrimination lawsuits against developers of AI hiring tools are in their infancy.

In one of these early cases, a job seeker sued HR software company Workday Inc. and claimed its automated tools rejected him from dozens of roles based on his race, age, and disability.

A California federal court on July 12 allowed some of the applicant’s bias claims to proceed under the argument that Workday could be liable as an agent of its employer-customers. The lawsuit adequately alleged that those customers delegated traditional employment tasks to Workday, Judge Rita Lin of the US District Court for the Northern District of California found.

Lin drew a distinction between Workday and a software vendor that provides a spreadsheet program that merely filters applicants based on employer criteria. That vendor would not be an agent because “the spreadsheet is not participating in the determination of which employees to hire,” she said.

“By contrast, Workday does qualify as an agent because its tools are alleged to perform a traditional hiring function of rejecting candidates at the screening stage and recommending who to advance to subsequent stages, through the use of artificial intelligence and machine learning,” Lin said.
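
To make the court’s line concrete, here is a minimal, purely hypothetical sketch in Python. It does not reflect Workday’s actual software or any real product; the function names and scoring rule are invented for illustration. A spreadsheet-style filter mechanically applies criteria the employer supplies, while an AI-style screener scores candidates itself and recommends whom to advance, the kind of “participation” Lin found sufficient for agent liability.

```python
# Hypothetical illustration of the distinction in Lin's ruling -- not real vendor code.

# Spreadsheet-style filtering: the employer supplies every criterion,
# and the tool applies them mechanically. The tool decides nothing.
def filter_applicants(applicants, min_years, required_degree):
    return [
        a for a in applicants
        if a["years_experience"] >= min_years and a["degree"] == required_degree
    ]

class DummyModel:
    """Stand-in for a learned screening model (toy scoring rule)."""
    def predict(self, applicant):
        # A real system would apply trained, opaque weights here.
        return min(applicant["years_experience"] / 10, 1.0)

# AI-style screening: the tool itself scores candidates and recommends
# who advances -- performing a traditional hiring function.
def screen_applicants(applicants, model, threshold=0.5):
    recommended, rejected = [], []
    for a in applicants:
        score = model.predict(a)  # decision rule the employer never specified
        (recommended if score >= threshold else rejected).append(a)
    return recommended, rejected

applicants = [
    {"name": "A", "years_experience": 7, "degree": "BS"},
    {"name": "B", "years_experience": 2, "degree": "BS"},
]
print(filter_applicants(applicants, min_years=3, required_degree="BS"))
print(screen_applicants(applicants, DummyModel()))
```

Under this framing, the first function leaves every hiring judgment with the employer, while the second embeds a decision rule the employer never sees, which is why, on the complaint’s allegations, the tool itself participates in hiring.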

However, she rejected a separate argument that an AI vendor could be liable as an “employment agency” under Title VII of the 1964 Civil Rights Act and other anti-bias laws—a stance backed by the US Equal Employment Opportunity Commission.

The complaint didn’t plausibly allege that Workday procures, or finds, candidates for employers—a key component in defining an employment agency, she wrote.

Even if other courts accept either the “agent” or “employment agency” theories, AI discrimination cases can be difficult to bring.

“Most workers don’t know they’ve been assessed by a discriminatory algorithmic tool or AI,” said Olga Akselrod, a senior staff attorney with the American Civil Liberties Union. “That makes workers much more vulnerable in this space.”

The ACLU has filed an EEOC charge against risk management company Aon Consulting Inc., claiming that the company’s hiring assessments discriminated based on race and disability.

The ACLU has also taken the consumer protection route, filing a complaint with the Federal Trade Commission against Aon and claiming that the company falsely advertised its algorithms as unbiased.

Shifting the Blame

Some employers who purchase AI hiring tools have been seeking to include protections in their contracts with vendors, such as indemnification provisions, said Jennifer Betts, co-chair of the technology practice group at Ogletree Deakins.

Those provisions would shift the cost of litigation from employer to vendor, but wouldn’t provide a discrimination liability shield to either.

“You cannot get indemnity for discrimination. It’s considered a public policy that you own and can’t pass off to someone else,” Horn said.

Many vendors are unwilling to sign indemnification agreements, he added.

When vendors do, they often place limits on the amount of money they are willing to pay—say three months of revenue or a specific dollar amount, said Dave Walton, co-chair of the AI, data and analytics team at Fisher Phillips LLP. They may also include provisions specifying how their software must be used for indemnification to be enforced.

AI developers outside of the hiring space have also begun to advertise indemnification provisions to companies. OpenAI, Adobe, and Microsoft have all said they will cover the cost of copyright lawsuits that companies face as a result of using their generative AI software.

Employers may also be able to sue vendors for breach of contract if they believe bias in purchased software violates the terms of their purchasing agreement, according to Walton.

“I haven’t seen that happen yet, but it’s going to,” he said. “It’s a matter of time.”

There’s also a possibility of an employer looping an AI vendor into an existing discrimination lawsuit as a third-party defendant.

“If a vendor violates an agreement that the tool will not result in unlawful bias, the employer may be able to claim breach of contract, as well as unfair and deceptive trade practices,” said Adam Forman, who co-leads the AI practice at Epstein Becker Green.

Aiding Discrimination

Another legal argument that could be used against vendors, Walton said, is that they are “aiding and abetting” hiring discrimination. That’s illegal in several states, including California and New Jersey.

“I’m sure it’s an argument someone is going to try to make,” Walton said.

Aiding-and-abetting cases are usually brought against supervisors or other individuals working for an employer. But the California Civil Rights Department has proposed modifying its fair employment regulations to clarify that the language applies to AI software vendors as well.

California civil rights law would allow the state to sue a company or let an affected individual obtain a right-to-sue letter to bring their own litigation. Judge Lin in the Workday case appeared open to the job applicant’s aiding-and-abetting claim under California law, allowing him to further amend his lawsuit to bolster that allegation.

State legislatures are also pursuing laws regulating AI hiring tools, measures that could raise concerns for vendors.

A bill signed into law by Colorado Gov. Jared Polis (D) in May requires tech developers to take “reasonable care” to protect state residents from discrimination. That means sharing information and documents about the AI system with the company deploying it.

State laws that hold vendors liable would open developers up to lawsuits not only from applicants, but from employers facing discrimination lawsuits and seeking a cross-claim, Walton said.

“It’s always good to be pointing the finger at someone else when you’re in front of a jury,” he said.

Other active bills in state legislatures would require employers to conduct audits and impact assessments, and make disclosures to applicants.

New York City implemented a first-of-its-kind law in 2023 that imposes penalties on employers who use AI hiring tools without a bias audit. Although the vendor can conduct the audit itself, the responsibility for compliance rests with the employer.

HR tech companies have gotten involved in shaping state lawmakers’ proposals. Workday has provided “input in the form of technical language” for some state-level legislation, a company spokesperson told Bloomberg Law in March.

Ultimately, if they aren’t already, developers are going to be forced to consider discrimination and its legal implications more deeply, according to Shea Brown, CEO of BABL AI, a consulting firm that advises on the ethical use of the technology.

“We’re going to see that vendors are going to have a pretty high bar that they’re going to have to meet because employers are going to be worried about this issue,” he said. “Vendors are going to have to step up.”
