The New York City Department of Consumer and Worker Protection (“DCWP”) recently issued a Notice of Adoption of Final Rule (“Final Rule”) relating to the implementation of New York City’s law regulating the use of automated employment decision tools (“AEDT”) by NYC employers and employment agencies.

Enforcement of NYC’s Local Law 144 will now begin on July 5, 2023.  As discussed in our prior post, Local Law 144 prohibits employers and employment agencies from using certain Artificial Intelligence (“AI”) tools in the hiring or promotion process unless the tool has been subject to a bias audit within one year prior to its use, the results of the audit are publicly available, and certain notice requirements to employees or job candidates are satisfied.

The issuance of DCWP’s Final Rule follows the prior release of two sets of proposed rules in September 2022 and December 2022.  The Final Rule’s most significant updates from the December 2022 proposal include an expansion of the definition of AEDTs and modifications to the requirements for bias audits.  Key provisions of the Final Rule are summarized below.

What is an Automated Employment Decision Tool?

Local Law 144 defines AEDTs as “any computational process, derived from machine learning, statistical modeling, data analytics, or artificial intelligence, that issues simplified output, including a score, classification, or recommendation, that is used to substantially assist or replace discretionary decision making for making employment decisions that impact natural persons.” 

Under the Final Rule, “machine learning, statistical modeling, data analytics, or artificial intelligence” is defined as a group of mathematical, computer-based techniques that:

  1. generate a prediction, meaning an expected outcome for an observation, such as an assessment of a candidate’s fit or likelihood of success, or that generate a classification, meaning an assignment of an observation to a group, such as categorizations based on skill sets or aptitude; and
  2. for which a computer at least in part identifies the inputs, the relative importance placed on those inputs, and, if applicable, other parameters for the models in order to improve the accuracy of the prediction or classification.

The Final Rule clarifies which tools fall within the scope of the law by defining the phrase “to substantially assist or replace discretionary decision making” as:

  1. relying “solely on a simplified output (score, tag, classification, ranking, etc.) with no other factors considered”;
  2. using the tool’s “output as one of a set of criteria where the simplified output is weighted more than any other criterion in the set”; or
  3. using the tool’s “output to overrule conclusions derived from other factors including human decision-making.”

Employers and employment agencies should be aware of four categories of requirements related to bias audits:

  • (i) Structure and Required Calculations;
  • (ii) Permissible Data;
  • (iii) Independent Auditor; and
  • (iv) Publication of Results.

Each category is addressed in more detail below.

Structure and Required Calculations

The Final Rule details the structure of and requirements for bias audits, including new requirements not found in the earlier proposed rules.  An AEDT cannot be used if more than one year has passed since the most recent bias audit.  Bias audits must adhere to the following:

  • Where an AEDT selects individuals to move forward in the hiring process or classifies individuals into groups, the bias audit must:
    • (i) calculate the selection rate for each category;
    • (ii) calculate the impact ratio for each category; and
    • (iii) indicate the number of individuals the AEDT assessed who are not included because they fall within an unknown category (e.g., applicants who declined to disclose demographic data). 

Categories mirror the EEO-protected categories reported on the U.S. Equal Employment Opportunity Commission’s EEO-1 Component 1 report.  These categories include race, ethnicity, and sex.

  • Where the AEDT only scores individuals rather than selecting them, the bias audit must:
    • (i) calculate the median score for the full sample of applicants;
    • (ii) calculate the rate at which individuals receive a score above the sample’s median score in each category/classification;
    • (iii) calculate the impact ratio for each category/classification; and
    • (iv) indicate the number of individuals the AEDT assessed who are not included because they fall within an unknown category. 

The impact ratio must be calculated either as (i) a selection rate for a category divided by the selection rate of the most selected category; or (ii) a scoring rate for a category divided by the scoring rate of the highest scoring category.  Impact ratio calculations may exclude a category that makes up less than 2% of the data being used for the bias audit.

The Final Rule also indicates that the required calculations described above must be conducted for standalone sex, race, and ethnicity categories (e.g., Male, Female, Hispanic or Latino, Black, Asian, White, etc.), as well as intersectional groupings (e.g., Black Females, White Males, etc.). 
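
For illustration only, the following minimal Python sketch shows how selection rates and impact ratios might be computed for standalone and intersectional categories, using entirely hypothetical applicant data; it is not a compliant audit methodology and is no substitute for an independent auditor’s analysis.  (Where an AEDT scores rather than selects candidates, the same impact-ratio calculation would instead use the rate at which each category scores above the sample’s median score.)

    # Illustrative sketch only: hypothetical data, not a compliant bias audit methodology.
    from collections import Counter

    # Hypothetical records: (sex, race/ethnicity, selected_by_AEDT)
    applicants = [
        ("Male", "White", True), ("Male", "White", True), ("Male", "Black", True),
        ("Female", "White", True), ("Female", "Black", False), ("Female", "Hispanic or Latino", True),
        ("Male", "Asian", False), ("Female", "Asian", True), ("Male", "Hispanic or Latino", False),
        ("Female", "White", False),
    ]

    def selection_rates(records, key):
        # Selection rate = number selected in a category / total number in that category.
        totals, selected = Counter(), Counter()
        for record in records:
            category = key(record)
            totals[category] += 1
            selected[category] += int(record[2])
        return {category: selected[category] / totals[category] for category in totals}

    def impact_ratios(rates):
        # Impact ratio = a category's selection rate / the selection rate of the most selected category.
        top_rate = max(rates.values())
        return {category: rate / top_rate for category, rate in rates.items()}

    # Standalone category (sex) and an intersectional grouping (sex x race/ethnicity).
    for label, key in [("sex", lambda r: r[0]), ("sex x race/ethnicity", lambda r: (r[0], r[1]))]:
        print(label, impact_ratios(selection_rates(applicants, key)))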

Permissible Data

Bias audits must use “historical data,” which is defined as “data collected during an employer or an employment agency’s use of an AEDT to assess candidates for employment or employees for promotion.”  Under the Final Rule, a bias audit may rely on historical data of other employers or employment agencies, but only if the employer or employment agency (i) has provided the independent auditor with historical data from its own use of the AEDT or (ii) has never used the AEDT.

Alternatively, test data may be used if there is insufficient historical data available for a statistically significant bias audit.  If test data is used, a summary of results of the bias audit must explain why historical data was not used, as well as describe how the test data was generated and obtained.

Independent Auditor

An “independent auditor” must perform bias audits.  The Final Rule clarifies that an “independent auditor” is “a person or group that is capable of exercising objective and impartial judgment on all issues within the scope of a bias audit of an AEDT.”  An auditor is not independent if the auditor:

  • (i) is or was involved in using, developing, or distributing the AEDT;
  • (ii) has an employment relationship with an employer or employment agency that uses the AEDT; or
  • (iii) has a direct or material indirect financial interest in an employer or employment agency that uses the AEDT. 

Similarly, an auditor is not independent if it has an employment relationship with or financial interest in a vendor that developed or distributes the AEDT.

Publication of Results

Local Law 144 requires that the results of a bias audit must be “made publicly available on the website of the employer or employment agency.”  The Final Rule clarifies that the published results — the date of the most recent bias audit, summary of results, and distribution date of the AEDT — must be posted on the employment section of the entity’s website in a “clear and conspicuous manner.” 

The summary of results must include the source and an explanation of the data used to conduct the audit; the number of individuals who fall within an unknown category; and the number of individuals, selection or scoring rates, and impact ratios for all categories.  If a category comprising less than 2% of the data being used for the bias audit is excluded from the required calculations for impact ratios, the summary must include the independent auditor’s justification, as well as the number of applicants and scoring rate or selection rate for the excluded category.

The published results must remain posted for at least six months after the AEDT was last used to make an employment decision.

Any changes to the notice requirements?

Local Law 144 requires that any employer or employment agency using an AEDT to screen an employee, or a candidate who has applied for a position, for an employment decision notify individuals who reside in New York City that the AEDT will be used in connection with their assessment or evaluation, and disclose the job qualifications and characteristics that the AEDT will consider.  Notice must be provided at least 10 business days before use of an AEDT and, notably, must include instructions for how to request an alternative selection process or accommodation.

The Final Rule’s notice provisions remain similar to those included in the proposed rules released in September 2022 and December 2022.  Importantly, the Final Rule clarifies that Local Law 144 only requires employers or employment agencies to include instructions for requesting an alternative selection process or accommodation to the extent such options are “available.”  While employers or employment agencies must still comply with reasonable accommodation obligations under other laws, the Final Rule states that “[n]othing under [Local Law 144] requires [them] to provide an alternative selection process.”

Looking Ahead

As an immediate next step, employers and employment agencies should identify whether their hiring and promotion efforts leverage AI tools that fall within the scope of NYC Local Law 144 and adjust their processes accordingly in advance of the July 5, 2023 enforcement date.

While NYC’s Local Law 144 is groundbreaking, it is likely only a small part of what will become an increasingly complex regulatory environment related to AI and machine learning.  Companies should prepare to comply with more laws and regulations as federal, state, and local legislatures take stock of the use of AI in various decision-making processes.  Covington will continue to monitor developments and publish relevant updates.  In the interim, if you have any questions about the material covered above, please contact members of Covington’s Employment, Data Privacy, and Technology groups.

Lindsay Burke

Lindsay Burke co-chairs the firm’s employment practice group and regularly advises U.S., international, and multinational employers on employee management issues and international HR compliance. Her practice includes advice pertaining to harassment, discrimination, leave, whistleblower, wage and hour, trade secret, and reduction-in-force issues arising under federal and state laws, and she frequently partners with white collar colleagues to conduct internal investigations of executive misconduct and workplace culture assessments in the wake of the #MeToo movement. Recently, Lindsay has provided critical advice and guidance to employers grappling with COVID-19-related employment issues.

Lindsay guides employers through the process of hiring and terminating employees and managing their performance, including the drafting and review of employment agreements, restrictive covenant agreements, separation agreements, performance plans, and key employee policies and handbooks. She provides practical advice against the backdrop of the web of state and federal employment laws, such as Title VII of the Civil Rights Act of 1964, the Americans with Disabilities Act, the Equal Pay Act, the Family and Medical Leave Act, the Fair Labor Standards Act, and the False Claims Act, with the objective of minimizing the risk of employee litigation. When litigation looms, Lindsay relies on her experience as an employment litigator to offer employers strategic advice and assistance in responding to demand letters and agency charges.

Lindsay works frequently with the firm’s privacy, employee benefits and executive compensation, corporate, government contracts, and cybersecurity practice groups to ensure that all potential employment issues are addressed in matters handled by these groups. She also regularly provides U.S. employment law training, support, and assistance to start-ups, non-profits, and foreign parent companies opening affiliates in the U.S.

Libbie Canter

Libbie Canter represents a wide variety of multinational companies on privacy, cyber security, and technology transaction issues, including helping clients with their most complex privacy challenges and the development of governance frameworks and processes to comply with global privacy laws. She routinely supports clients on their efforts to launch new products and services involving emerging technologies, and she has assisted dozens of clients with their efforts to prepare for and comply with federal and state privacy laws, including the California Consumer Privacy Act and California Privacy Rights Act.

Libbie represents clients across industries, but she also has deep expertise in advising clients in highly-regulated sectors, including financial services and digital health companies. She counsels these companies — and their technology and advertising partners — on how to address legacy regulatory issues and the cutting edge issues that have emerged with industry innovations and data collaborations.

Jennifer Johnson

Jennifer Johnson is a partner specializing in communications, media and technology matters who serves as Co-Chair of Covington’s Technology Industry Group and its global and multi-disciplinary Artificial Intelligence (AI) and Internet of Things (IoT) Groups. She represents and advises technology companies, content distributors, television companies, trade associations, and other entities on a wide range of media and technology matters. Jennifer has almost three decades of experience advising clients in the communications, media and technology sectors, and has held leadership roles in these practices for almost twenty years. On technology issues, she collaborates with Covington’s global, multi-disciplinary team to assist companies navigating the complex statutory and regulatory constructs surrounding this evolving area, including product counseling and technology transactions related to connected and autonomous vehicles, internet connected devices, artificial intelligence, smart ecosystems, and other IoT products and services. Jennifer serves on the Board of Editors of The Journal of Robotics, Artificial Intelligence & Law.

Jennifer assists clients in developing and pursuing strategic business and policy objectives before the Federal Communications Commission (FCC) and Congress and through transactions and other business arrangements. She regularly advises clients on FCC regulatory matters and advocates frequently before the FCC. Jennifer has extensive experience negotiating content acquisition and distribution agreements for media and technology companies, including program distribution agreements, network affiliation and other program rights agreements, and agreements providing for the aggregation and distribution of content on over-the-top app-based platforms. She also assists investment clients in structuring, evaluating, and pursuing potential investments in media and technology companies.

Teresa Lewi

Teresa Lewi represents and counsels companies on a wide range of federal, state, and local employment laws. She focuses her practice on trade secrets, non-competition, executive compensation, separation, employee mobility, discrimination, workplace privacy, and wage-and-hour issues.

Teresa represents clients in the life sciences, technology, financial services, sports, and entertainment industries. She has successfully tried cases in federal and state courts, and has resolved numerous disputes through alternative dispute resolution methods. In particular, Teresa has helped companies achieve highly favorable outcomes in high-stakes disputes over the protection of trade secrets and enforcement of agreements with employees. In addition, she defends companies against public accommodation and website accessibility claims under federal and state anti-discrimination laws.

Teresa also conducts specialized internal investigations and assessments designed to help companies protect their confidential information and trade secrets from employee misappropriation and cybersecurity incidents.

Micaela McMurrough

Micaela McMurrough has represented clients in high-stakes antitrust, patent, trade secrets, contract, and securities litigation, and other complex commercial litigation matters, and serves as co-chair of Covington’s global and multi-disciplinary Internet of Things (IoT) group. She also represents and advises domestic and international clients on cybersecurity and data privacy issues, including cybersecurity investigations and cyber incident response. Micaela has advised clients on data breaches and other network intrusions, conducted cybersecurity investigations, and advised clients regarding evolving cybersecurity regulations and cybersecurity norms in the context of international law.

In 2016, Micaela was selected as one of thirteen Madison Policy Forum Military-Business Cybersecurity Fellows. She regularly engages with government, military, and business leaders in the cybersecurity industry in an effort to develop national strategies for complex cyber issues and policy challenges. Micaela previously served as a United States Presidential Leadership Scholar, principally responsible for launching a program to familiarize federal judges with various aspects of the U.S. national security structure and national intelligence community.

Prior to her legal career, Micaela served in the Military Intelligence Branch of the United States Army. She served as Intelligence Officer of a 1,200-member maneuver unit conducting combat operations in Afghanistan and was awarded the Bronze Star.

Carolyn Rashby

Carolyn Rashby provides business-focused advice and counsel to companies navigating the constantly evolving and overlapping maze of federal, state, and local employment requirements. She conducts workplace investigations and cultural assessments, leads audits regarding employee classification, wage and hour, and I-9 compliance, advises on employment issues arising in corporate transactions, and provides strategic counsel to clients on a wide range of workplace matters, including harassment and #MeToo issues, wage and hour, worker classification, employee accommodations, termination decisions, employment agreements, trade secrets, restrictive covenants, employee handbooks, and personnel policies. Her approach is preventive, while recognizing the need to set clients up for the best possible defense should disputes arise.

Lindsey Tonsager

Lindsey Tonsager co-chairs the firm’s global Data Privacy and Cybersecurity practice. She advises clients in their strategic and proactive engagement with the Federal Trade Commission, the U.S. Congress, the California Privacy Protection Agency, and state attorneys general on proposed changes to data protection laws, and regularly represents clients in responding to investigations and enforcement actions involving their privacy and information security practices.

Lindsey’s practice focuses on helping clients launch new products and services that implicate the laws governing the use of artificial intelligence, data processing for connected devices, biometrics, online advertising, endorsements and testimonials in advertising and social media, the collection of personal information from children and students online, e-mail marketing, disclosures of video viewing information, and new technologies.

Lindsey also assesses privacy and data security risks in complex corporate transactions where personal data is a critical asset or data processing risks are otherwise material. In light of a dynamic regulatory environment where new state, federal, and international data protection laws are always on the horizon and enforcement priorities are shifting, she focuses on designing risk-based, global privacy programs for clients that can keep pace with evolving legal requirements and efficiently leverage the clients’ existing privacy policies and practices. She conducts data protection assessments to benchmark against legal requirements and industry trends and proposes practical risk mitigation measures.

Kareem Carryl

Kareem Carryl is an associate in the Congressional Investigations, Election and Political Law, and White Collar Defense and Investigations Practice Groups. He advises clients on internal investigations, and on cooperating with and responding to high-profile investigations before Congress and the Department of Justice that entail significant legal and reputational risks.

As a member of Covington’s Institutional Culture and Social Responsibility practice, Kareem guides clients undergoing civil rights and racial equity audits and assessments.

Kareem maintains an active pro bono practice.