Recently, California Governor Gavin Newsom signed several privacy and related bills into law, including new laws governing browser opt-out preference signals, social media account deletion, data brokers, reproductive and health services, age signals for app stores, social media “black box warning” labels for minors, and companion chatbots. This blog summarizes the statutes’ key takeaways.

  • Opt-Out Preference Signals: The California Opt Me Out Act (AB 566) will require businesses that develop or maintain browsers to include functionality configurable by a consumer that enables the browser to send an opt-out preference signal. Additionally, a business that develops or maintains a browser must make clear to a consumer in public disclosures how the opt-out preference signal works and the intended effect of the opt-out preference signal. The law states that a business that maintains or develops a browser that includes the opt-out preference signal shall not be liable for a violation of the title by a business that receives the opt-out preference signal. AB 566 will take effect January 1, 2027, and provides the California Privacy Protection Agency (“CPPA”) rulemaking authority.
  • Social Media Account Deletion: AB 656 will require social media platforms that generate more than $100 million per year in gross revenues to provide a “clear and conspicuous” button to complete an account deletion request. “Social media platform” is defined by reference to Section 22675 of the California Business and Professions Code as a “public or semipublic internet-based service or application that has users in California” where (1) a “substantial function” of the service or application is to connect users and allow them to interact socially with each other, and (2) the service or application allows users to construct a public or semipublic profile, populate a list of users with whom the individual shares a social connection, and create or post content viewable by other users. If verification is needed for an account deletion request, it must be provided in a cost-effective and easy-to-use manner through preestablished two-factor authentication, email, text, telephone, or message means.
  • Data Brokers: SB 361 amends the California data broker registration law (the “Delete Act”) to require additional disclosures from brokers when they register with the CPPA. Specifically, data brokers registering with the CPPA will be required to provide certain new information, such as whether the data broker collects names, addresses, phone numbers, mobile advertising identifiers, precise geolocation, or status related to union membership, sexual orientation, and gender identity, among other topics. Data brokers will also be required to disclose whether, in the past year, they sold or shared consumers’ data with a foreign actor, a federal or state government, law enforcement, or a developer of a generative AI (“GenAI”) system or model. A developer of a GenAI system is defined as a business, person, corporation, or similar entity that designs, codes, produces, or substantially modifies a GenAI system. A GenAI system is defined as an AI system that can generate derived synthetic content, including text, images, video, and audio, that emulates the structure and characteristics of the system’s training data.
  • Reproductive and Health Services: AB 45 will amend existing law to provide additional privacy protections for persons seeking or providing reproductive and health services at a family planning center. AB 45 will prohibit the collection, use, disclosure, sharing, sale, or retention of personal information of any person physically located at or within 1,850 feet of a family planning center, except to perform a requested service or provide requested goods. Additionally, AB 45 will prohibit geofencing of entities that provide in-person health services in order to, among other things, identify or track a person seeking, receiving, or providing health care services, or to send advertisements related to these sensitive locations or health services. In addition to civil penalties, the statute provides a private right of action for certain violations.
  • Age Signals for App Stores: The Digital Age Assurance Act (AB 1043) will require an operating system provider to collect birth date or age from account holders at account setup. Operating system providers must use this age information to provide an age signal to application developers, who will be required to request that information when a user downloads and launches an application. The law applies broadly to operating systems on a computer, mobile device, or any other general purpose computing device. The law will take effect on January 1, 2027.
  • Social Media “Black Box” Warning Labels: The Social Media Warning Law (AB 56) will require covered platforms to display a mental health “black box warning” label to users under the age of 18 each calendar day that a user uses the covered platform, including when the user first accesses the platform, after three hours of cumulative use, and thereafter, once per hour of continued cumulative use. The platform will not be required to display the label if the platform has reasonably determined that the user is over 17. The law will take effect January 1, 2027.
  • Companion Chatbots: SB 243 will impose new requirements on operators of companion chatbot platforms. The law defines a “companion chatbot” as an “artificial intelligence (‘AI’) system with a natural language interface that provides adaptive, human-like responses to user inputs and is capable of meeting a user’s social needs,” including sustaining a relationship across sessions or exhibiting anthropomorphic features. If a reasonable person would believe they are interacting with a human, then the operator must provide a clear and conspicuous notice indicating the chatbot is artificially generated. Additionally, operators must implement measures to prevent the companion chatbot from providing content to the user related to suicidal ideation, suicide, or self-harm. Beginning July 1, 2027, operators must annually report to the Office of Suicide Prevention their protocols to detect, remove, and respond to certain content, such as suicidal ideation and self-harm. There are additional requirements if the operator knows the user is a minor. The operator must (1) disclose to the minor user that they are interacting with artificial intelligence, (2) provide by default a notification every three hours reminding the minor user to take a break and that the chatbot is AI, and (3) institute “reasonable measures” to prevent the chatbot from sharing sexually explicit content with the minor user or encouraging the minor user to engage in sexually explicit conduct.
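As background on the browser mechanism AB 566 addresses: existing opt-out preference signals such as Global Privacy Control (GPC) are transmitted as an HTTP request header (`Sec-GPC: 1`). The sketch below is a hypothetical illustration of how a receiving business might detect such a signal; AB 566 does not prescribe a specific signal or header name, and the function name here is our own.

```python
# Hypothetical sketch: detecting a browser opt-out preference signal.
# Global Privacy Control (GPC) is one such signal, sent as the request
# header "Sec-GPC: 1". AB 566 does not mandate GPC specifically; this
# only illustrates the general mechanism.

def honors_opt_out(headers: dict[str, str]) -> bool:
    """Return True if the request carries an opt-out preference signal."""
    # HTTP header names are case-insensitive; normalize before lookup.
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    return normalized.get("sec-gpc") == "1"
```

A business receiving the signal would treat a `True` result as a request to opt out of sale or sharing of personal information.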
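To illustrate the age-signal mechanism in AB 1043: the operating system collects a birth date at account setup and passes developers a coarse signal rather than the raw date. The sketch below is hypothetical; the bracket boundaries shown are assumptions for illustration, not necessarily the statutory categories.

```python
from datetime import date

# Hypothetical sketch of an OS-side age signal under AB 1043. The bracket
# boundaries (under 13, 13-15, 16-17, 18+) are illustrative assumptions;
# the actual categories are set by the statute and any implementing guidance.

def age_bracket(birth_date: date, today: date) -> str:
    """Map a collected birth date to a coarse age bracket for developers."""
    # Compute completed years, accounting for whether the birthday
    # has occurred yet this year.
    years = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )
    if years < 13:
        return "under_13"
    if years < 16:
        return "13_to_15"
    if years < 18:
        return "16_to_17"
    return "18_plus"
```

An application developer would request this bracket from the operating system when the user downloads and launches the application, rather than collecting age directly.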
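The warning-label cadence in AB 56 (first access, three hours of cumulative daily use, then once per additional hour) can be expressed as a simple schedule. The function below is a hypothetical sketch of that timing logic, not an implementation prescribed by the statute.

```python
# Hypothetical sketch of the AB 56 label cadence: show the warning at first
# access of the calendar day, again at 3 hours of cumulative use, and once
# per hour of continued cumulative use thereafter.

def label_times_minutes(total_minutes_used: int) -> list[int]:
    """Cumulative-use marks (in minutes) at which a label was due."""
    due = [0]        # first access of the calendar day
    mark = 180       # 3 hours of cumulative use
    while mark <= total_minutes_used:
        due.append(mark)
        mark += 60   # once per hour of continued use
    return due
```

For example, a minor with five hours (300 minutes) of cumulative use in a day would have been due labels at 0, 180, 240, and 300 minutes.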
Lindsey Tonsager

Lindsey Tonsager co-chairs the firm’s global Data Privacy and Cybersecurity practice. She advises clients in their strategic and proactive engagement with the Federal Trade Commission, the U.S. Congress, the California Privacy Protection Agency, and state attorneys general on proposed changes to data protection laws, and regularly represents clients in responding to investigations and enforcement actions involving their privacy and information security practices.

Lindsey’s practice focuses on helping clients launch new products and services that implicate the laws governing the use of artificial intelligence, data processing for connected devices, biometrics, online advertising, endorsements and testimonials in advertising and social media, the collection of personal information from children and students online, e-mail marketing, disclosures of video viewing information, and new technologies.

Lindsey also assesses privacy and data security risks in complex corporate transactions where personal data is a critical asset or data processing risks are otherwise material. In light of a dynamic regulatory environment where new state, federal, and international data protection laws are always on the horizon and enforcement priorities are shifting, she focuses on designing risk-based, global privacy programs for clients that can keep pace with evolving legal requirements and efficiently leverage the clients’ existing privacy policies and practices. She conducts data protection assessments to benchmark against legal requirements and industry trends and proposes practical risk mitigation measures.

Libbie Canter

Libbie Canter represents a wide variety of multinational companies on managing privacy, cyber security, and artificial intelligence risks, including helping clients with their most complex privacy challenges and the development of governance frameworks and processes to comply with U.S. and global privacy laws. She routinely supports clients on their efforts to launch new products and services involving emerging technologies, and she has assisted dozens of clients with their efforts to prepare for and comply with federal and state laws, including the California Consumer Privacy Act, the Colorado AI Act, and other state laws. As part of her practice, she also regularly represents clients in strategic transactions involving personal data, cybersecurity, and artificial intelligence risk and represents clients in enforcement and litigation postures.

Libbie represents clients across industries, but she also has deep expertise in advising clients in highly regulated sectors, including financial services and digital health companies. She counsels these companies — and their technology and advertising partners — on how to address legacy regulatory issues and the cutting-edge issues that have emerged with industry innovations and data collaborations.

Chambers USA 2025 ranks Libbie in Band 3 Nationwide for both Privacy & Data Security: Privacy and Privacy & Data Security: Healthcare. Chambers USA notes that Libbie is “incredibly sharp and really thorough. She can do the nitty-gritty, in-the-weeds legal work incredibly well but she also can think of a bigger-picture business context and help to think through practical solutions.”

Jayne Ponder

Jayne Ponder provides strategic advice to national and multinational companies across industries on existing and emerging data privacy, cybersecurity, and artificial intelligence laws and regulations.

Jayne’s practice focuses on helping clients launch and improve products and services that involve laws governing data privacy, artificial intelligence, sensitive data and biometrics, marketing and online advertising, connected devices, and social media. For example, Jayne regularly advises clients on the California Consumer Privacy Act, Colorado AI Act, and the developing patchwork of U.S. state data privacy and artificial intelligence laws. She advises clients on drafting consumer notices, designing consent flows and consumer choices, drafting and negotiating commercial terms, building consumer rights processes, and undertaking data protection impact assessments. In addition, she routinely partners with clients on the development of risk-based privacy and artificial intelligence governance programs that reflect the dynamic regulatory environment and incorporate practical mitigation measures.

Jayne routinely represents clients in enforcement actions brought by the Federal Trade Commission and state attorneys general, particularly in areas related to data privacy, artificial intelligence, advertising, and cybersecurity. Additionally, she helps clients to advance advocacy in rulemaking processes led by federal and state regulators on data privacy, cybersecurity, and artificial intelligence topics.

As part of her practice, Jayne also advises companies on cybersecurity incident preparedness and response, including by drafting, revising, and testing incident response plans, conducting cybersecurity gap assessments, engaging vendors, and analyzing obligations under breach notification laws following an incident.

Jayne maintains an active pro bono practice, including assisting small and nonprofit entities with data privacy topics and elder estate planning.

Jenna Zhang

Jenna Zhang advises clients across industries on data privacy, cybersecurity, and emerging technologies. 

Jenna partners with clients to ensure their compliance with the rapidly evolving federal and state privacy and cybersecurity laws. She supports clients in designing new products and services, drafting privacy notices and terms of use, responding to cyber and data security incidents, and evaluating privacy and cybersecurity risks in corporate transactions. In particular, she advises clients on substantive requirements relating to children’s and student privacy, including COPPA, FERPA, age-appropriate design code laws, and social media laws.

As part of her practice, Jenna regularly represents clients in data privacy investigations and enforcement actions brought by the Federal Trade Commission and state attorneys general. She also supports clients in proactive engagement with regulators and policymakers to ensure their perspectives are heard.

Jenna also maintains an active pro bono practice with a focus on supporting families in adoptions, guardianships, and immigration matters.

Ariel Dukes

Ariel Dukes is an associate in the firm’s Washington, DC office and a member of the Data Privacy and Cybersecurity Practice Group.

Ariel counsels clients on data privacy, cybersecurity, and artificial intelligence. Her practice includes partnering with clients on compliance with comprehensive privacy laws, FTC and consumer protection laws and guidance, and laws governing the handling of health-related data. Additionally, Ariel routinely counsels clients on drafting and negotiating privacy terms with vendors and third parties, developing privacy notices and consent forms, and responding to regulatory inquiries regarding privacy and cybersecurity topics. Ariel also advises clients on trends in artificial intelligence regulations and helps design governance programs for the development and deployment of artificial intelligence technologies across a number of industries.

Bryan Ramirez

Bryan Ramirez is an associate in the firm’s San Francisco office and is a member of the Data Privacy and Cybersecurity Practice Group. He advises clients on a range of regulatory and compliance issues, including compliance with state privacy laws. Bryan also maintains an active pro bono practice.