New AI Recruitment Regulation: How NYC AI Law 144 Affects You

Katy Culver

Marketing @ Dover

June 1, 2023

7 min

The law goes into effect on July 5, 2023. Violations mean massive fines.

If you’re hiring remotely in the United States or in New York City, pay attention.

New York City has passed a new regulation that requires transparency into how companies use AI and automation in their hiring processes: NYC Law 144.

Dover is committed to helping our customers comply with NYC Law 144 and staying on top of new regulations around AI in hiring. That’s why we’re fully compliant as of June 26, 2023. And we’re willing to cover the cost for our customers to be compliant, too (more on that later).

Achieving compliance across all of your employment tools should be a top priority. Since NYC is a major talent market, compliance impacts companies hiring remotely nationwide. And other major geographies are likely to follow suit with similar regulation to mitigate bias in employment decisions.

This is the first of many regulations expected in the near future. COOs, risk officers, and recruiters in particular should pay attention to ensure their companies will be compliant.


What is NYC Law 144?

NYC Law 144 requires companies to limit the use of AI-powered employment tools to those that have published results of a customized bias audit.

The law applies to any HR tool that uses algorithms, machine learning, or AI to sort or evaluate applicants — known as Automated Employment Decision Tools (AEDTs).

Under Local Law 144, employers and employment agencies are prohibited from using AEDTs for employment decisions without conducting annual bias audits, publishing public summaries of the audits, and providing specific notices to applicants and employees subject to the screening.

Violations will result in fines of up to $1,500 per violation, accruing daily for each tool that’s non-compliant.

What is a bias audit?

The goal of a “bias audit” is to understand whether or not AEDTs treat people differently based on their race, ethnicity, or sex.

That means you’ll have to hire an independent auditor to check whether the AEDT's rules cause unfair outcomes for certain groups of people. The law doesn't specify how the audit should be done or what counts as passing or failing. But it does require that the audit be conducted annually by an independent auditor not affiliated with the company, and that a summary of the results be posted on the company's website.

Bias audits review the selection rates and impact ratios across a defined set of categories to check how outcomes are distributed throughout your hiring process.

  • Selection rates measure how often candidates in each category are selected to move forward. The rate can be calculated by dividing the number of individuals in the category moving forward or assigned a classification by the total number of individuals in the category who applied for a position or were considered for promotion.

  • Impact ratios measure either (1) the selection rate for a category divided by the selection rate of the most selected category or (2) the scoring rate for a category divided by the scoring rate for the highest scoring category.

In order to be compliant, you need to measure selection rates and impact ratios across these groups:

  • Sex categories (e.g., selection of male candidates vs. female candidates),

  • Race/ethnicity categories (e.g., selection of Hispanic or Latino candidates vs. Black or African American [Not Hispanic or Latino] candidates), and

  • Intersectional categories of sex, ethnicity, and race (e.g., selection of Hispanic or Latino male candidates vs. Not Hispanic or Latino Black or African American female candidates).

The tool’s end results, referred to as “simplified outputs” (scores, rankings, or recommendations), are also evaluated. Bias audits must be conducted within one year of using the AI-assisted tools.
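To make the two metrics above concrete, here is a minimal sketch in Python of how selection rates and impact ratios can be computed from screening outcomes. The applicant records and category labels are hypothetical, and a real audit would cover all required sex, race/ethnicity, and intersectional categories via an independent auditor:

```python
from collections import defaultdict

# Hypothetical applicant records: (category, was_selected) pairs.
# In a real audit the categories would be the sex, race/ethnicity,
# and intersectional groups the law requires.
applicants = [
    ("Male", True), ("Male", True), ("Male", False), ("Male", False),
    ("Female", True), ("Female", False), ("Female", False), ("Female", False),
]

def selection_rates(records):
    """Selection rate per category: number selected / total in that category."""
    selected = defaultdict(int)
    total = defaultdict(int)
    for category, was_selected in records:
        total[category] += 1
        if was_selected:
            selected[category] += 1
    return {c: selected[c] / total[c] for c in total}

def impact_ratios(rates):
    """Impact ratio per category: its selection rate divided by the
    selection rate of the most-selected category."""
    top = max(rates.values())
    return {c: rate / top for c, rate in rates.items()}

rates = selection_rates(applicants)   # e.g., {"Male": 0.5, "Female": 0.25}
ratios = impact_ratios(rates)         # e.g., {"Male": 1.0, "Female": 0.5}
```

In this toy data, female candidates move forward at half the rate of male candidates (impact ratio 0.5), the kind of disparity a bias audit is designed to surface.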

What you need to do to comply

To comply with the law, you must:

  • Publish the results of a Bias Audit for each AI employment tool 10 days before using the tool (or by June 26 for the tools you’re actively using, since the law goes into effect July 5). As a reminder, if you’re using Dover, we’ve conducted your first Bias Audit for you.

  • Pause use of any tools without a published audit until one is completed.

  • Ask tool vendors and software providers about compliance and auditing as you evaluate new HR solutions.

How Dover helps with compliance

We have completed our own bias audit with an independent auditor. Our free tools, like Dover’s ATS and Copilot Sourcing Extension, have been evaluated to avoid unfair impacts based on gender, ethnicity, and other attributes.

Good news: Dover automatically runs audits for all of our customers, even for companies using our free products.

So if you’re using Dover, we’ve got you covered. Check with any other AI-powered employment tools you’re using to see how they’re handling compliance.

Staying ahead of the curve on NYC Law 144 and similar regulations is important. Compliance will also help you ensure you’re running fair hiring processes while still getting the advantages that come with using the latest tech.

