New York City Issues Final Rule on AI Bias Audit Law and Delays Enforcement to July
On April 6, 2023, the New York City Department of Consumer and Worker Protection ("DCWP") published a Notice of Adoption of Final Rule to implement New York City's recent law mandating bias audits of AI-enabled systems used in employment and hiring decisions in New York City and the publication of the results of such audits (see prior blog post). The DCWP also announced that it would delay enforcement of the law and the Final Rule from April 15, 2023, until July 5, 2023.
By way of background, a bias audit is an impartial evaluation of an automated employment decision tool ("AEDT") used by an employer that assesses the tool's potential disparate impact on the basis of sex, race, and ethnicity. The bias audit must be conducted by an "independent" auditor and must meet the specific, prescriptive requirements set forth in the Final Rule.
The Final Rule seeks to clarify the revised rules issued on December 15, 2022 (the "revised rules"). (Our client alert covering the revised rules is available here.) Specifically, the Final Rule addresses: (i) the definition of "machine learning, statistical modeling, data analytics, or artificial intelligence"; (ii) the requirements for bias audits and the data used in bias audits; and (iii) the requirements for publishing the results of bias audits. Each of these changes is discussed below.
Defined Terms
The Final Rule modifies the definition of "machine learning, statistical modeling, data analytics, or artificial intelligence" to expand its scope by eliminating the requirement that the computer-based technique at issue refine "inputs and parameters . . . through cross-validation or by using training and testing data." In other words, the DCWP broadened the pool of covered AEDTs: a tool no longer needs to refine its inputs and parameters through cross-validation or training and testing data to fall within the definition.
Bias Audit Requirements
With respect to the requirements for bias audits, the revised rules required employers to calculate the selection rate and impact ratio for the sex, race, and ethnicity categories of covered individuals, as well as for the intersectional categories of sex, race, and ethnicity. The Final Rule builds on these requirements and adds a requirement that employers indicate the number of individuals assessed by an AEDT who are not included in the required calculations because they fall within an unknown EEO category.
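For illustration only, the arithmetic behind these figures can be sketched as follows. The category labels, sample data, and Python code are our own hypothetical example, not language from the Final Rule or DCWP guidance; in this toy example, the intersectional sex and race/ethnicity categories appear directly in the labels.

    # Hypothetical illustration of the selection rate / impact ratio arithmetic.
    from collections import Counter

    # Each record: (EEO category label, whether the AEDT selected the individual).
    # "unknown" marks individuals for whom no EEO category data is available.
    applicants = [
        ("Hispanic or Latino, Male", True),
        ("Hispanic or Latino, Male", False),
        ("White, Female", True),
        ("White, Female", True),
        ("Black or African American, Female", True),
        ("Black or African American, Female", False),
        ("unknown", True),
        ("unknown", False),
    ]

    totals, selected = Counter(), Counter()
    unknown_count = 0
    for category, was_selected in applicants:
        if category == "unknown":
            unknown_count += 1  # reported separately, excluded from the calculations
            continue
        totals[category] += 1
        selected[category] += was_selected

    # Selection rate: share of each category selected by the AEDT.
    selection_rates = {c: selected[c] / totals[c] for c in totals}

    # Impact ratio: each category's selection rate divided by the highest selection rate.
    highest = max(selection_rates.values())
    impact_ratios = {c: rate / highest for c, rate in selection_rates.items()}

    print(selection_rates, impact_ratios, unknown_count)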
The Final Rule further clarifies that independent auditors may exclude from the required impact ratio calculations any EEO category that represents less than 2% of the data being used for the bias audit. If a category is excluded, the summary of results must explain the independent auditor's justification for the exclusion and state the number of applicants and the "scoring rate or selection rate" for the excluded category.
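Again purely as an illustrative sketch (the function name, inputs, and threshold check are our own framing of the rule, not a DCWP-prescribed method), an auditor's screening for such small categories might look like this:

    # Hypothetical helper: identify categories an auditor may exclude because they
    # represent less than 2% of the audit data. If an exclusion is taken, the summary
    # of results must still give the auditor's justification and report the category's
    # applicant count and scoring/selection rate.
    def categories_below_threshold(totals, selection_rates, threshold=0.02):
        grand_total = sum(totals.values())
        return {
            category: {
                "applicants": count,
                "selection_rate": selection_rates[category],
                # A written justification for the exclusion would accompany this entry.
            }
            for category, count in totals.items()
            if count / grand_total < threshold
        }

With the totals and selection_rates dictionaries from the previous sketch, calling categories_below_threshold(totals, selection_rates) would return the categories eligible for exclusion along with the figures that must still be disclosed.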
The Final Rule also includes additional examples of adequate bias audits and expands upon previous examples. For instance, one example in the Final Rule now makes clear that a bias audit is required not only when an AEDT is used for a final hiring decision, but also when it is used to screen candidates at an "early point in the application process." The examples also now include language indicating the number of individuals who were excluded from the required calculations for selection rates and impact ratios because they fell within an unknown EEO category.
Data Requirements
The revised rules permitted employers to use test data where the historical data of a covered AEDT was insufficient. The Final Rule clarifies that the historical data used in a bias audit may be drawn from one or more employers or employment agencies that use the covered AEDT, but an employer or employment agency may rely on such an audit only if it (i) has provided the independent auditor with historical data from its own use of the AEDT, or (ii) has never used the AEDT.
As an example, the Final Rule provides that an employer using an AEDT for the first time may rely on a bias audit conducted using either (i) historical data of other employers or employment agencies, or (ii) test data.
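To make the structure of this condition concrete, a hypothetical check might look like the following; the function and its inputs are our own shorthand for the rule's two prongs, not text from the Final Rule.

    # Hypothetical sketch of the Final Rule's condition for relying on a bias audit
    # that uses other employers' or employment agencies' historical data.
    def may_rely_on_shared_historical_audit(has_used_aedt: bool,
                                            provided_own_history: bool) -> bool:
        # Prong (i): the employer gave the auditor historical data from its own use,
        # or prong (ii): the employer has never used the AEDT at all.
        return provided_own_history or not has_used_aedt

On this framing, a first-time user (has_used_aedt=False) satisfies the condition even without contributing its own data, which is consistent with the Final Rule's example above.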
Published Results Requirements
Per the Final Rule, bias audit results must now include the number of individuals assessed by the AEDT at issue who fall within an unknown EEO category. The Final Rule also states that the "number of applicants or candidates" must be published. It is unclear whether this requirement applies to the total number of applicants or candidates assessed by the AEDT or only to those who fall within an unknown EEO category.
Other regulatory bodies are also weighing bias audits and related tools as a means of regulating AI systems. For example, the federal Department of Commerce's recent request for comment on AI system accountability measures and policies could serve as the foundation for a broader federal AI audit obligation in the future. DWT will continue to provide updates regarding the New York City bias audit law and related regulatory developments across the country while helping clients address these issues. Additional relevant alerts from DWT are available under the Related Posts below.