What You Need to Know About NYC AI Law 144

Katy Culver
Marketing
June 1, 2023 • 7 min

What is NYC Law 144?
NYC Law 144 requires companies to limit the use of AI-powered employment tools to those that have published results of a customized bias audit.
The law applies to any HR tool that uses algorithms, machine learning, or AI to sort or evaluate applicants — known as Automated Employment Decision Tools (AEDTs).
Under Local Law 144, employers and employment agencies are prohibited from using AEDTs for employment decisions without conducting annual bias audits, publishing public summaries of the audits, and providing specific notices to applicants and employees subject to the screening.
Violations can result in civil penalties of up to $500 for a first violation and between $500 and $1,500 for each subsequent violation, with each day a non-compliant tool is used counting as a separate violation.
What is a bias audit?
The goal of a “bias audit” is to determine whether AEDTs treat people differently based on their race, ethnicity, or sex.
That means you’ll have to hire an independent auditor to check whether the AEDT’s rules produce unfair outcomes for certain groups of people. The law doesn’t specify how the audit should be done or what counts as passing or failing. But it does require that the audit be conducted every year by someone independent of the company, and that a summary of the results be posted on the company’s website.
Bias audits review the selection rates and impact ratios across a defined set of categories to check whether outcomes are distributed evenly across groups at each stage of your process.
Selection rates measure how often candidates in each category are selected to move forward. The rate can be calculated by dividing the number of individuals in the category moving forward or assigned a classification by the total number of individuals in the category who applied for a position or were considered for promotion.
Impact ratios measure either (1) the selection rate for a category divided by the selection rate of the most selected category or (2) the scoring rate for a category divided by the scoring rate for the highest scoring category.
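The two calculations above can be sketched in a few lines of Python. The category names and counts below are illustrative assumptions, not real audit data:

```python
def selection_rate(selected, total):
    """Candidates in a category who move forward, divided by all candidates in that category."""
    return selected / total

# Hypothetical counts per sex category (illustrative only)
applicants = {"Male": 200, "Female": 180}
advanced = {"Male": 60, "Female": 45}

rates = {cat: selection_rate(advanced[cat], applicants[cat]) for cat in applicants}

# Impact ratio: each category's selection rate divided by the
# selection rate of the most selected category
best = max(rates.values())
impact_ratios = {cat: rates[cat] / best for cat in rates}

print(rates)          # Male: 0.30, Female: 0.25
print(impact_ratios)  # Male: 1.0, Female: ~0.83
```

Under the commonly used four-fifths rule of thumb, an impact ratio below 0.8 for any category is a signal worth investigating, though Local Law 144 itself does not set a pass/fail threshold.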
In order to be compliant, you need to measure selection rates and impact ratios across these groups:
Sex categories (e.g., selection of male candidates vs female candidates),
Race/Ethnicity categories (e.g., selection of Hispanic or Latino candidates vs Black or African American [Not Hispanic or Latino] candidates), and
Intersectional categories of sex, ethnicity, and race (e.g., selection of Hispanic or Latino male candidates vs. Not Hispanic or Latino Black or African American female candidates).
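Computing the intersectional breakdown is the same calculation keyed on (sex, race/ethnicity) pairs. A minimal sketch, again with made-up candidate records:

```python
from collections import defaultdict

# Hypothetical records: (sex, race/ethnicity, whether the candidate advanced).
# Names and outcomes are illustrative only.
candidates = [
    ("Male", "Hispanic or Latino", True),
    ("Male", "Hispanic or Latino", False),
    ("Female", "Hispanic or Latino", True),
    ("Male", "Black or African American (Not Hispanic or Latino)", True),
    ("Female", "Black or African American (Not Hispanic or Latino)", False),
    ("Female", "Black or African American (Not Hispanic or Latino)", True),
]

totals, selected = defaultdict(int), defaultdict(int)
for sex, ethnicity, advanced in candidates:
    key = (sex, ethnicity)  # intersectional category
    totals[key] += 1
    selected[key] += advanced  # True counts as 1

rates = {k: selected[k] / totals[k] for k in totals}
best = max(rates.values())
impact_ratios = {k: rates[k] / best for k in rates}
```

In a real audit the same grouping would be run over the tool’s actual applicant data, and categories with very few candidates may need to be noted separately since small samples make the ratios unstable.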
The tool’s end results, referred to as “simplified outputs” (scores, rankings, or recommendations), are also evaluated. A bias audit must have been conducted no more than one year before the tool is used.
What you need to do to comply
To comply with the law, you must:
Publish the results of a bias audit for each AI employment tool at least 10 days before using the tool (or by June 26 for the tools you’re actively using, since the law goes into effect July 5). As a reminder, if you’re using Dover, we’ve conducted your first Bias Audit for you.
Pause use of any tools without a published audit until one is completed.
Ask tool vendors and software providers about compliance and auditing as you evaluate new HR solutions.
How Dover helps with compliance
We have completed our own bias audit with an independent auditor. Our free tools, like Dover’s ATS and Copilot Sourcing Extension, have been evaluated for unfair impacts based on gender, ethnicity, and other attributes.
Good news: Dover automatically runs audits for all of our customers, even for companies using our free products.
So if you’re using Dover, we’ve got you covered. Check with any other AI-powered employment tools you’re using to see how they’re handling compliance.
Staying ahead of the curve on NYC Law 144 and similar regulations is important. Compliance will also help you ensure you’re running fair hiring processes while still getting the advantages that come with using the latest tech.