Ahead of its December 8 board meeting, the California Privacy Protection Agency (CPPA) has issued draft risk assessment regulations.  The CPPA has yet to initiate the formal rulemaking process and has stated that it expects to begin formal rulemaking next year, at which time it will also consider draft regulations covering “automated decisionmaking technology” (ADMT), cybersecurity audits, and revisions to existing regulations.  Accordingly, the draft risk assessment regulations are subject to change.  Below are the key takeaways:

When a Risk Assessment is Required: The draft regulations would require businesses to conduct a risk assessment before processing consumers’ personal information in a manner that “presents significant risk to consumers’ privacy.”  The draft regulations identify several activities that would present such risk:

  • Selling or sharing personal information;
  • Processing sensitive personal information (except in certain situations involving employees and independent contractors);
  • Using ADMT (1) for a decision that produces legal or similarly significant effects concerning a consumer, (2) to profile a consumer who is acting in their capacity as an employee, independent contractor, job applicant, or student, (3) to profile a consumer while they are in a public place, or (4) for profiling for behavioral advertising; or
  • Processing a consumer’s personal information if the business has actual knowledge the consumer is under 16.

The draft regulations also contemplate imposing risk assessment requirements where a business processes consumers’ personal information to train ADMT or artificial intelligence used for certain purposes, including:

  • The uses of ADMT described above;
  • Establishing individual identity based on biometric information;
  • Facial-, speech-, or emotion-detection;
  • Generating deep fakes; or
  • Operating generative models, including large language models.

The draft regulations would require businesses engaged in such training-related processing to make additional disclosures to “recipient-businesses” that are given access to the resulting ADMT.  The draft regulations would also require businesses to document their compliance with applicable disclosure requirements in their risk assessments.

Risk Assessment Requirements

The draft regulations would require that risk assessments contain a range of information including, for example:

  • A description of the data to be processed and the “operational elements” of the processing;
  • An evaluation of the benefits associated with the processing for the business, consumers, and others, as compared against any negative privacy impacts on consumers; and
  • A description of the safeguards that the business will put in place to mitigate the processing’s negative privacy impacts on consumers.

As mentioned above, the draft regulations would require additional disclosures for businesses using ADMT for certain purposes.  In such cases, a business's risk assessment would need to provide additional information, including:

  • An explanation of how the business evaluates the “validity, reliability, and fairness” of the ADMT;
  • An explanation of whether the business evaluated other versions of the ADMT for validity, reliability, and fairness, and why it selected the specific ADMT;
  • A description of the need for human involvement (or lack thereof) in the business's use of the ADMT; and
  • An explanation of the ADMT’s logic, including any assumptions. 

Lindsey Tonsager co-chairs the firm’s global Data Privacy and Cybersecurity practice. She advises clients in their strategic and proactive engagement with the Federal Trade Commission, the U.S. Congress, the California Privacy Protection Agency, and state attorneys general on proposed changes to data protection laws, and regularly represents clients in responding to investigations and enforcement actions involving their privacy and information security practices.

Lindsey’s practice focuses on helping clients launch new products and services that implicate the laws governing the use of artificial intelligence, data processing for connected devices, biometrics, online advertising, endorsements and testimonials in advertising and social media, the collection of personal information from children and students online, e-mail marketing, disclosures of video viewing information, and new technologies.

Lindsey also assesses privacy and data security risks in complex corporate transactions where personal data is a critical asset or data processing risks are otherwise material. In light of a dynamic regulatory environment where new state, federal, and international data protection laws are always on the horizon and enforcement priorities are shifting, she focuses on designing risk-based, global privacy programs for clients that can keep pace with evolving legal requirements and efficiently leverage the clients’ existing privacy policies and practices. She conducts data protection assessments to benchmark against legal requirements and industry trends and proposes practical risk mitigation measures.

Libbie Canter represents a wide variety of multinational companies on managing privacy, cyber security, and artificial intelligence risks, including helping clients with their most complex privacy challenges and the development of governance frameworks and processes to comply with U.S. and global privacy laws. She routinely supports clients on their efforts to launch new products and services involving emerging technologies, and she has assisted dozens of clients with their efforts to prepare for and comply with federal and state laws, including the California Consumer Privacy Act, the Colorado AI Act, and other state laws. As part of her practice, she also regularly represents clients in strategic transactions involving personal data, cybersecurity, and artificial intelligence risk and represents clients in enforcement and litigation postures.

Libbie represents clients across industries, but she also has deep expertise in advising clients in highly regulated sectors, including financial services and digital health companies. She counsels these companies — and their technology and advertising partners — on how to address legacy regulatory issues and the cutting-edge issues that have emerged with industry innovations and data collaborations.

Chambers USA 2024 ranks Libbie in Band 3 Nationwide for both Privacy & Data Security: Privacy and Privacy & Data Security: Healthcare. Chambers USA notes that Libbie is “incredibly sharp and really thorough. She can do the nitty-gritty, in-the-weeds legal work incredibly well but she also can think of a bigger-picture business context and help to think through practical solutions.”

Jayne Ponder provides strategic advice to national and multinational companies across industries on existing and emerging data privacy, cybersecurity, and artificial intelligence laws and regulations.

Jayne’s practice focuses on helping clients launch and improve products and services that involve laws governing data privacy, artificial intelligence, sensitive data and biometrics, marketing and online advertising, connected devices, and social media. For example, Jayne regularly advises clients on the California Consumer Privacy Act, Colorado AI Act, and the developing patchwork of U.S. state data privacy and artificial intelligence laws. She advises clients on drafting consumer notices, designing consent flows and consumer choices, drafting and negotiating commercial terms, building consumer rights processes, and undertaking data protection impact assessments. In addition, she routinely partners with clients on the development of risk-based privacy and artificial intelligence governance programs that reflect the dynamic regulatory environment and incorporate practical mitigation measures.

Jayne routinely represents clients in enforcement actions brought by the Federal Trade Commission and state attorneys general, particularly in areas related to data privacy, artificial intelligence, advertising, and cybersecurity. Additionally, she helps clients to advance advocacy in rulemaking processes led by federal and state regulators on data privacy, cybersecurity, and artificial intelligence topics.

As part of her practice, Jayne also advises companies on cybersecurity incident preparedness and response, including by drafting, revising, and testing incident response plans, conducting cybersecurity gap assessments, engaging vendors, and analyzing obligations under breach notification laws following an incident.

Jayne maintains an active pro bono practice, including assisting small and nonprofit entities with data privacy topics and elder estate planning.

John Bowers is an associate in the firm’s Washington, DC office. He is a member of the Data Privacy and Cybersecurity Practice Group and the Technology and Communications Regulation Practice Group.

John advises clients on a wide range of privacy and communications issues, including compliance with telecommunications regulations and U.S. state and federal privacy laws.