
If your organisation uses Automated Employment Decision Tools, they must be audited for bias.

The New York City Local Law 144, also known as the “Automated Employment Decision Tools Act”, prohibits employers from using Automated Employment Decision Tools (AEDTs) in recruitment unless the tool is audited for bias annually.

The NYC law defines AEDTs as “any software, algorithm, or other technology that is used to make or substantially assist in making an employment decision, including but not limited to decisions regarding hiring, promotion, termination, compensation, or other terms and conditions of employment”.

The NYC law requires employers who use AEDTs to:

  • Conduct a bias audit of the tool before it is used.
  • Publish a summary of the bias audit.
  • Provide notice to applicants and employees about the use of the tool.

The law also prohibits employers from using AEDTs that have a discriminatory impact on protected classes, such as race, ethnicity, gender, and disability.

The NYC LL 144 came into force on 1 January 2023, and the Department of Consumer and Worker Protection (DCWP) began enforcing it on 5 July 2023.
If your company uses AI-driven hiring or employee screening tools, you must take the prescribed steps to ensure compliance.

The NYC law is intended to address concerns about the potential for bias in AEDTs. AEDTs are typically trained on large datasets, and if those datasets are not representative of the population, the tools may be biased against certain groups of people. The law’s requirements for bias audits and notices are intended to help ensure that the automated tools used in employment are fair and unbiased.

How can Idiro help?

Bias Audit:

Our audit examines whether your hiring tools display unjustifiable differences between demographic groups (gender, race/ethnicity) in employment decisions. 

The audit evaluates whether the AEDT model in use can distinguish between suitable and unsuitable candidates, or whether it uses sensitive attributes (gender or ethnicity) as a proxy for candidate suitability.

We calculate selection rates, impact ratios and scoring rates per group to compare gender, race/ethnicity and intersectional categories separately, as required by the Department of Consumer and Worker Protection (DCWP). 
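As an illustration, the selection-rate and impact-ratio calculations described by the DCWP can be sketched as follows. This is a minimal sketch, not our audit methodology; the category labels and counts are invented for the example:

```python
from collections import Counter

def selection_rates(decisions):
    """decisions: list of (category, was_selected) pairs.
    Selection rate per category = number selected / number assessed."""
    assessed = Counter(cat for cat, _ in decisions)
    selected = Counter(cat for cat, sel in decisions if sel)
    return {cat: selected[cat] / assessed[cat] for cat in assessed}

def impact_ratios(rates):
    """Impact ratio = category's rate / rate of the most-selected category."""
    best = max(rates.values())
    return {cat: rate / best for cat, rate in rates.items()}

# Hypothetical sample: 100 male and 100 female candidates
sample = ([("Male", True)] * 40 + [("Male", False)] * 60
          + [("Female", True)] * 25 + [("Female", False)] * 75)

rates = selection_rates(sample)   # {"Male": 0.4, "Female": 0.25}
ratios = impact_ratios(rates)     # {"Male": 1.0, "Female": 0.625}
```

Intersectional categories are handled the same way, with each (gender, race/ethnicity) combination treated as its own category.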

NYC Local Law 144 requires any organisation using AEDTs to publish the bias audit results, which leaves employers open to investigation by the EEOC if biases are found. Testing your AEDT models with our audit will reveal whether your models unjustly favour some candidates over others (e.g. males over females). Our audit will highlight where the AEDT is systematically rejecting good candidates because of gender or race, or hiring subpar candidates.

Before using an AEDT, an employer operating in New York City must make the AEDT audit summary publicly available on the employment section of their website. The summary must include:
  • The date of the most recent bias audit of the AEDT 
  • The source and explanation of the data used to conduct the bias audit 
  • The number of individuals assessed by the AEDT who fall into the unknown category 
  • The number of applicants or candidates 
  • The selection rates, the scoring rates and the impact ratios for each category (gender, race/ethnicity and intersectional) 
  • The issue date of the AEDT used by the organisation 
We provide all the information outlined above in a ready-to-publish format. The employer must keep the bias audit results and the issue date of the AEDT on their website for at least six months after the latest use of the tool for an employment decision.
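For AEDTs that output a score rather than a binary selection, the DCWP rules define the scoring rate as the share of a category scoring above the overall sample median, with impact ratios computed against the highest-scoring category. A minimal sketch, using invented group labels and scores:

```python
from statistics import median

def scoring_rates(scored):
    """scored: list of (category, numeric_score) pairs.
    Scoring rate = share of the category scoring above the sample median."""
    med = median(score for _, score in scored)
    rates = {}
    for cat in sorted({c for c, _ in scored}):
        cat_scores = [s for c, s in scored if c == cat]
        rates[cat] = sum(s > med for s in cat_scores) / len(cat_scores)
    return rates

# Hypothetical scores from a screening model
scored = [("Group A", 90), ("Group A", 80), ("Group A", 60), ("Group A", 40),
          ("Group B", 70), ("Group B", 50), ("Group B", 30), ("Group B", 20)]

rates = scoring_rates(scored)   # sample median is 55
# Impact ratio = category scoring rate / highest category scoring rate
ratios = {cat: r / max(rates.values()) for cat, r in rates.items()}
```

Here Group A scores above the median 3 times out of 4 (rate 0.75) and Group B once out of 4 (rate 0.25), giving Group B an impact ratio of one third.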

At our company, we believe in promoting fairness and equity in the use of AI technologies. Contact Us today to learn more about our bias auditing service and how we can help your company comply with the new NYC Local Law 144 legislation.

Let’s discuss AI Audit

Please fill in the form, and we will get back to you

Contact Us
