Reveals: AI Hiring Scores Could Be Treated Like Credit Checks

At a Glance

  • A lawsuit seeks to treat AI hiring scores as consumer reports under the FCRA.
  • The case targets Eightfold, a leading AI recruiting platform.
  • Plaintiffs argue they were denied jobs without seeing or correcting their scores.

Why it matters: If the court rules in their favor, AI hiring tools may face stricter transparency and consumer-rights requirements.

The lawsuit was filed on Wednesday in California state court by two women working in STEM. They claim that AI hiring screeners have filtered them out of jobs they were qualified for. “I’ve applied to hundreds of jobs, but it feels like an unseen force is stopping me from being fairly considered,” said Erin Kistler, one of the plaintiffs, in a press release. “It’s disheartening, and I know I’m not alone in feeling this way.”

A survey from the World Economic Forum notes that roughly 88% of companies now use some form of AI for initial candidate screening. The lawsuit specifically targets Eightfold, an AI human resources company that sells tools designed to help employers manage recruiting and hiring. Among its offerings is a tool that generates a numerical score predicting the likelihood that a candidate is a good match for a given role.

Eightfold’s “match score” is generated using information pulled from a variety of sources, including job postings, an employer’s desired skills, applications, and, in some cases, LinkedIn. The model then produces a score ranging from zero to five that “helps predict the degree of match between a candidate and a job position.”

The lawsuit argues that this process effectively produces a “consumer report” under the Fair Credit Reporting Act (FCRA), a federal law passed in 1970 to regulate credit bureaus and background check companies. Because the score aggregates personal information and translates it into a ranking used to determine eligibility for “employment purposes,” the lawsuit claims Eightfold should be required to follow the same rules that apply to credit reporting agencies. Those rules include notifying applicants when such a report is being created, obtaining their consent, and giving them the chance to dispute any inaccurate information.

Eightfold’s response

Eightfold’s spokesperson said in an emailed statement that the company believes the allegations are without merit. “Eightfold’s platform operates on data intentionally shared by candidates or provided by our customers,” the spokesperson said. “We do not scrape social media and the like. We are deeply committed to responsible AI, transparency, and compliance with applicable data protection and employment laws.”

Despite the company’s statements, the lawsuit seeks financial damages as well as a court order requiring Eightfold to comply with state and federal consumer reporting laws. The plaintiffs argue that qualified workers across the country are being denied job opportunities based on automated assessments they have never seen and cannot correct.

Legal context

Jenny R. Yang, a lawyer for the case and former chair of the U.S. Equal Employment Opportunity Commission, said: “These are the very real harms Congress sought to prevent when it enacted the FCRA. As hiring tools evolve, AI companies like Eightfold must comply with these common-sense legal safeguards meant to protect everyday Americans.”

The case raises questions about how AI systems used in hiring are regulated. If the court accepts the plaintiffs’ argument, AI hiring scores could be subject to the same disclosure and dispute-resolution requirements that govern credit reports. That would force companies to provide applicants with more transparency about how their data is used.

Potential industry impact

  • Companies may need to redesign scoring algorithms to avoid creating consumer reports.
  • Employers could face additional compliance costs for data-protection and consumer-rights procedures.
  • Candidates might gain the right to see and challenge their AI-generated scores.

Timeline of events

Date       Event
Wednesday  Lawsuit filed in California state court
1970       FCRA enacted
2023       World Economic Forum reports 88% of companies use AI for screening

Key takeaways

  1. The lawsuit could set a precedent that AI hiring scores are treated as consumer reports.
  2. If successful, companies like Eightfold would need to obtain consent, notify users, and allow disputes.
  3. The outcome may increase transparency and fairness in AI-driven recruitment.
  4. Employers may need to adjust their hiring tools to comply with new legal obligations.

The case is still in its early stages, and the court has yet to decide whether the FCRA applies to AI-generated hiring scores. The outcome will likely influence how AI hiring tools are built and regulated across the tech and business sectors.

Author

  • Morgan J. Carter covers city government and housing policy for News of Austin, reporting on how growth and infrastructure decisions affect affordability. A former Daily Texan writer, he’s known for investigative, records-driven reporting on the systems shaping Austin’s future.
