On Wednesday, the Federal Trade Commission (“FTC”) hosted a virtual event on “Protecting Kids from Stealth Advertising in Digital Media.”  The event brought together industry professionals, legal and child development experts, researchers, and consumer advocates to discuss the regulation of digital advertising to children.  Panelists examined the online advertising techniques children are exposed to, children’s capacity to understand and recognize advertising, and the potential harms associated with advertising in an ever-evolving digital landscape.

Continue Reading FTC Hosts Event Regarding Children’s Experiences with Digital Advertising

As we previously discussed, the California Privacy Protection Agency (“CPPA”) recently released updated rules implementing the California Privacy Rights Act (“CPRA”).  While the changes are modest, they are directionally helpful in addressing some of the concerns industry raised during the rulemaking process.  Here are some of the key changes from those rules.

Continue Reading Some Key Takeaways from The Updated CPRA Rules

The California Privacy Protection Agency (CPPA) staff has posted updated draft rules implementing the California Privacy Rights Act (CPRA).  The CPPA Board will discuss the updated draft rules during two virtual public meetings on Friday, October 21 and Saturday, October 22.  Agency staff and counsel will also be present at these meetings, which may follow the process the CPPA Rulemaking Process Subcommittee proposed during the last board meeting.  The updated draft rules, which retain some of the more controversial provisions from the previous draft, such as the rules on opt-out preference signals, are available here.  An explanation of the proposed changes, including certain staff-recommended items for discussion during the board meetings, can be found here.

Many employers and employment agencies have turned to artificial intelligence (“AI”) tools to assist them in making better and faster employment decisions, including in the hiring and promotion processes.  The use of AI for these purposes has been scrutinized and will now be regulated in New York City.  The New York City Department of Consumer and Worker Protection (“DCWP”) recently issued a Notice of Public Hearing and Opportunity to Comment on Proposed Rules relating to the implementation of New York City’s law regulating the use of automated employment decision tools (“AEDT”) by NYC employers and employment agencies.  As detailed further below, the comment period is open until October 24, 2022.

Continue Reading Artificial Intelligence & NYC Employers:  New York City Seeks Public Comment on Proposed Rules That Would Regulate the Use of AI Tools in the Employment Context

On October 7, 2022, President Biden signed an Executive Order directing the steps that the United States will take to implement its commitments under the new EU-U.S. Data Privacy Framework.  The framework was announced by the U.S. and the EU Commission in March 2022, after reaching a political agreement in principle (see our blog post here).

The Executive Order on Enhancing Safeguards for United States Signals Intelligence Activities is intended to address the concerns raised by the Court of Justice of the EU (“CJEU”) in its Schrems II judgment on July 16, 2020, which annulled the EU-U.S. Privacy Shield (see our blog post here).  There, the CJEU held that the U.S. did not provide an “essentially equivalent” level of data protection to that found in the EU, due in part to extensive powers granted to U.S. law enforcement and intelligence agencies to access individuals’ personal data, and an absence of effective legal remedies for EU residents in connection with those powers.  The CJEU focused on two U.S. authorities in particular:  FISA Section 702 and Executive Order 12333.

To address these concerns, the new Executive Order sets forth certain “privacy and civil liberties safeguards” for U.S. signals intelligence activities and creates a new method of redress for non-U.S. persons from “qualifying states.”  In particular, and among other provisions, the Executive Order:

  • Provides that U.S. signals intelligence activities shall be “necessary” and “proportionate” to a “validated intelligence priority.”  The Executive Order provides that U.S. signals intelligence activities may only be conducted following a determination that they are “necessary to advance a validated intelligence priority,” and “only to the extent and in a manner that is proportionate to the validated intelligence priority for which they have been authorized.”  Exec. Order § 2(a)(ii)(A).  The Executive Order also specifies certain “legitimate objectives” and “prohibited objectives” for which U.S. signals intelligence activities may be carried out.  Exec. Order § 2(b).  For example, the Executive Order defines “legitimate objectives” to include understanding or assessing the capabilities, intentions, or activities of foreign organizations that pose a current or potential threat to the national security of the U.S. or its allies or partners; protecting against terrorism, the taking of hostages, and the holding of individuals captive conducted by or on behalf of a foreign government, foreign organization, or foreign person; and understanding or assessing transnational threats that impact global security.  Exec. Order § 2(b)(i).
  • Sets forth requirements for the handling of personal information collected through signals intelligence.  Each element of the U.S. Intelligence Community that handles personal information collected through signals intelligence must establish policies and procedures to minimize the dissemination and retention of personal information.  Exec. Order § 2(c)(iii)(A).  For example, under the Executive Order, the U.S. Intelligence Community may not disseminate personal information collected through signals intelligence solely because of a person’s nationality or country of residence, and shall retain non-U.S. persons’ personal information “only if the retention of comparable information concerning United States persons would be permitted under applicable law.”  Exec. Order § 2(c)(iii)(A).  The Executive Order further provides that each element of the Intelligence Community must maintain appropriate training requirements to ensure that employees with access to signals intelligence know and understand the requirements of the Order.  Exec. Order § 2(d)(ii).  The Executive Order encourages the Privacy and Civil Liberties Oversight Board to review the U.S. Intelligence Community’s updated policies and procedures to ensure that they are “consistent with the enhanced safeguards” contained in the Order.  Exec. Order § 2(c)(v).
  • Establishes a mechanism for non-U.S. persons to seek review of the U.S. Intelligence Community’s signals intelligence activities.  Within sixty days of the Executive Order’s issuance, the Director of National Intelligence (“DNI”), in consultation with the U.S. Attorney General and the heads of elements of the U.S. Intelligence Community, shall establish a process for the submission of “qualifying complaints transmitted by the appropriate public authority in a qualifying state.”  Exec. Order § 3(b).  To implement this redress mechanism, the Attorney General may designate a country or regional economic integration organization a “qualifying state” based on a determination, in consultation with the Secretary of State, the Secretary of Commerce, and the DNI, that the country’s or organization’s laws establish “appropriate safeguards” for U.S. persons’ personal information that is transferred from the United States.  Exec. Order § 3(f).  The DNI’s Civil Liberties Protection Officer (“CLPO”) will investigate, review, and, as necessary, order “appropriate remediation” for complaints from qualifying states.  Exec. Order § 3(c).  “Appropriate remediation” may include, depending on the specific covered violation at issue, terminating acquisition of data where collection is not lawfully authorized, deleting data that had been acquired without lawful authorization, or restricting access to lawfully collected data to those appropriately trained.  Exec. Order § 4(a).
  • Creates a Data Protection Review Court to review the CLPO’s determination regarding qualifying complaints.  The Attorney General, in consultation with the Secretary of Commerce, the DNI, and the Privacy and Civil Liberties Oversight Board, shall appoint judges to serve on a newly created Data Protection Review Court, who will be legal practitioners with appropriate experience in the fields of data privacy and national security law, and who may not be U.S. government employees.  Exec. Order § 3(d).  Following the CLPO’s determination, the complainant (or, in the event of an adverse decision against the U.S. government, an element of the U.S. Intelligence Community) may apply to the Data Protection Review Court for review of the CLPO’s decision.  Exec. Order § 3(c)(i)(E).  Upon receipt of an application for review, a three-judge panel of the Data Protection Review Court will convene to review the application and select a special advocate to assist with the review, including by advocating the complainant’s interest in the matter.  Exec. Order § 3(d)(i)(B)‑(C).  The Data Protection Review Court’s determination shall be binding on the U.S. Intelligence Community.  Exec. Order § 3(d)(ii).

The European Commission will now review the Executive Order and commence drafting a new adequacy decision pursuant to Article 45 of the GDPR.  The European Commission must then hear from the European Data Protection Board (“EDPB”) and the EU Member States.  The formal adoption process is expected to take around six months, which could see a final adequacy decision published in March 2023.

Once adopted, any new framework is certain to be pressure-tested before the EU courts.  To date, a number of privacy advocacy groups have issued statements opining that the new Executive Order is insufficient.

***

The Covington team will continue to monitor developments on the EU-U.S. Data Privacy Framework and report on them on our blog, Inside Privacy.

On October 10, 2022, the draft rules implementing the Colorado Privacy Act (“CPA”) were officially published in the Colorado Register.  Written comments on the draft rules are due by November 7, 2022.  The CPA draft rules share some similarities with the draft rules set forth by the California Privacy Protection Agency (“CPPA”) interpreting the California Privacy Rights Act (“CPRA”).  Both sets of draft rules address requirements for privacy policy disclosures, consumer rights requests, and opt-out mechanisms.  However, there are a number of key differences between the two drafts.  We highlight some of these below.

Continue Reading Colorado Attorney General Releases Draft CPA Rules

On September 28, 2022, the European Commission published its long-promised proposal for an AI Liability Directive.  The draft Directive is intended to complement the EU AI Act, which the EU’s institutions are still negotiating.  In parallel, the European Commission also published its proposal to update the EU’s 1985 Product Liability Directive.  If adopted, the proposals will change the liability rules for software and AI systems in the EU.

The draft AI Liability Directive establishes rules applicable to non-contractual, fault-based civil claims involving AI systems.  Specifically, the proposal establishes rules that would govern the preservation and disclosure of evidence in cases involving high-risk AI, as well as rules on the burden of proof and corresponding rebuttable presumptions.  If adopted as proposed, the draft AI Liability Directive will apply to damages that occur two years or more after the Directive enters into force; five years after its entry into force, the Commission will consider the need for rules on no-fault liability for AI claims.

If the draft Directive on Liability of Defective Products is adopted, EU Member States will have one year from its entry into force to implement it in their national laws.  The draft Directive would apply to products placed on the market one year after it enters into force.

Continue Reading European Commission Publishes Directive on the Liability of Artificial Intelligence Systems

On September 16, 2022, the European Commission published its Proposal for a European Media Freedom Act (“Proposed MFA”). The Proposed MFA is broadly designed to protect media pluralism and independence in the EU. It does so by setting a common set of rules “for all EU media players,” in particular, providers of “media services.” The Proposed MFA also imposes new obligations on providers of “very large online platforms” (“VLOPs”) as defined in the EU’s Digital Services Act (“DSA”).

Continue Reading European Commission publishes its Proposal for a European Media Freedom Act

On October 4, 2022, the EU adopted the Digital Services Act (“DSA”), which imposes new rules on providers of intermediary services (e.g., cloud services, file-sharing services, search engines, social networks, and online marketplaces).  The DSA will enter into force on November 16, 2022, although it will fully apply only as of February 17, 2024.

Continue Reading EU Adopts Digital Services Act

This post is the first in a series of blog posts about the Digital Markets Act (“DMA”), which was adopted on July 18, 2022.  This post deals specifically with those provisions of the DMA that are relevant to organizations’ privacy programs.

Continue Reading The Digital Markets Act for Privacy Professionals