NYC Has Outlawed Racist and Sexist AI-Based Hiring


A new law requires companies to prove their software isn't biased in hiring new employees

A new law taking effect Wednesday in New York City requires companies to ensure their AI-driven hiring tools are free from bias and to notify job candidates that those tools are being used.

NBC News reports that the law, which aims to outlaw racism or sexism in AI-assisted hiring, is believed to be the first of its kind in the world. Companies must prove their software isn't biased through a third-party audit and are required to publish the results. AI-based hiring tools have been on the rise as more job candidates use services that send their information to hundreds of companies. The Equal Employment Opportunity Commission says 83 percent of employers use some sort of automated tool for hiring, including 99 percent of Fortune 500 companies.


As NBC reports, it's not entirely clear what the penalties will be for breaking the new hiring law. The law, as written, says only that "violations of the provisions of the bill would be subject to a civil penalty." New York City's Department of Consumer and Worker Protection will be in charge of enforcing the law. That might be a problem, according to Jake Metcalf, a researcher who studies AI at the nonprofit Data & Society. Metcalf told NBC that lawyers are advising some companies not to take the new law seriously. "There are quite a few employment law firms in New York that are advising their clients that they don't have to comply, given the letter of the law, even though the spirit of the law would seem to apply to them," he said.

Related: The Challenge of Hiring Anti-Racist Employees

As far back as 2019, research has made clear that AI-based hiring tools can carry an anti-Black bias. The rapid spread of artificial intelligence software has only deepened skepticism among Black candidates about bias in these kinds of tools.