On 31 May 2022, the Italian Parliament approved Law 62/2022, also known as the Sunshine Act, which entered into force on 26 June 2022. The new rules will become fully operational once the Ministry of Health sets up the public database where companies will have to disclose their data.  In practice, this means the new transparency system will not be enforceable before 2023. 

Prior to the approval of the Sunshine Act, only member companies of trade associations, such as Farmindustria or Confindustria Dispositivi Medici, were under an obligation to disclose the transfers of value made to healthcare professionals (“HCPs”) and organizations (“HCOs”).  While non-member companies had no corresponding obligation, many disclosed their transfers on a voluntary basis.  Under these industry codes, companies could disclose data on transfers in aggregate form, rather than individually, in two circumstances: (i) when collecting individual consent was not possible, or (ii) when the transfer concerned R&D expenses.  The Sunshine Act, however, does not contain these derogations and entirely excludes the option of publishing only aggregated data.

Companies within the scope of the Sunshine Act will have to disclose their transfers of value on a dedicated, publicly accessible online database that the Ministry of Health will set up and manage within 6 months of the entry into force of the Sunshine Act.  The database will be called “Sanità Trasparente” and will include data such as the professional contact details and affiliation number of the HCPs, the contact details of the HCOs, and all other details concerning the transfer of value.  The data stored on the database can be freely searched and sorted by the public for at least 5 years following publication.

Within 3 months from the entry into force of the Sunshine Act, the Ministry of Health, in collaboration with the Agency for Digital Italy (AgID), the National Anticorruption Authority (ANAC), and the Italian Data Protection Authority (Garante Privacy), will decide on the structure of the database, including its technical features and the procedure through which companies will disclose their data online.  The system should incorporate privacy-by-design and privacy-by-default features.

Companies are required to disclose three distinct categories of data:

  1. Transfers of money, goods, services or other benefits made to HCPs or HCOs (“ToV”).
  2. Agreements with HCPs and HCOs providing them with direct or indirect benefits “consisting of participation in conferences, training events, committees, commissions, advisory bodies or scientific committees or the establishment of consulting, teaching or research relationships” (“Agreements”).
  3. The details of those HCPs and HCOs that (i) hold quotas, shares or bonds in the company (“Shares”), or (ii) received fees from the company for the economic exploitation of their intellectual property licenses (“Licenses”).

As anticipated, companies must disclose these data exclusively on an individual basis (i.e., per identified HCP/HCO).  To this end, the Sunshine Act establishes that privacy consent is considered provided at the moment when the HCP or HCO accepts the ToV, signs the Agreements, or acquires the Shares or the Licenses.  This raises questions about the consistency of the provision with the GDPR, in particular with the freely given nature of consent and the right to withdraw it.  The Sunshine Act clarifies that companies are under an obligation to inform HCPs and HCOs of the disclosure on the Ministry’s Sanità Trasparente database by providing them with a privacy notice that must clarify, at a minimum, that their data will be published.  The Act also provides that the publication of the transfer of value is without prejudice to the rights of data subjects under Articles 15-19 and 21 of the GDPR, which raises questions on the application of certain rights, such as the right to erasure.

* * *

The Covington team will keep monitoring the implementation of the Sunshine Act and the relevant database, and is happy to provide advice or answer any questions you may have on the topic.

On July 5, 2022, the European Parliament adopted the Digital Services Act (“DSA”) with 539 votes in favor, 54 votes against and 30 abstentions, following the political deal reached on April 23, 2022 (see our previous blog here).

Key aspects

The DSA is addressed to providers of intermediary services (e.g., Internet service providers, cloud providers, search engines, social networks and other online platforms, and online marketplaces).  The DSA will also apply to providers established outside the EU, to the extent they offer services to business and individual users established or located in the EU.

Among a range of topics, the DSA requires:

  • implementation of notice-and-action mechanisms;
  • setting up internal complaint-handling systems;
  • ensuring the traceability of traders on online marketplaces; and
  • compliance with detailed transparency and accountability obligations, including specifically on online advertising and algorithms used to recommend content. 

Moreover, the DSA bans so-called “dark patterns”, as well as online advertising that targets minors or is based on sensitive personal data.

The strictest set of obligations is addressed to providers of “very large online platforms” and “very large online search engines”, i.e., those reaching an average of 45 million or more monthly active users in the EU and designated as such by the Commission.  Specific obligations for such players include:

  • conducting assessments of “systemic risks” stemming from the design, functioning and use of their services, including algorithmic systems, in the EU;
  • conducting yearly independent audits;
  • granting access to data to the authorities, upon request, for the purposes of monitoring and assessing compliance with the DSA, and explaining the design, logic, functioning and the testing of algorithmic systems;
  • establishing an independent compliance function;
  • paying an annual supervisory fee to the Commission; and
  • complying with certain actions required by the Commission in cases of extraordinary circumstances leading to a serious threat to public security or public health.

Next steps

The DSA text must now be adopted by the Council (expected in September 2022).  The DSA will enter into force twenty days after publication in the EU Official Journal.

The DSA will be directly applicable across the EU and will apply from fifteen months after its entry into force or from January 1, 2024, whichever comes later.  However, the DSA will become enforceable sooner for very large online platforms and very large online search engines, i.e., four months after being designated as such by the Commission.


The Covington team will keep monitoring the developments on the DSA, and is happy to assist with any inquiries on the topic.

The UK Government recently published its long-awaited response to its data reform consultation, ‘Data: A new direction’ (see our post on the consultation, here).

As many readers are aware, following Brexit, the UK Government has to walk a fine line between trying to reduce the compliance burden on organizations and retaining the ‘adequacy’ status that the European Commission granted in 2021 (see our post on the decision, here).

While we’ll have to wait to review the detail of the final legislation, we outline below some of the more eye-catching proposals for reform.

Continue Reading 8 Eye-catching Reforms in the UK Government’s Response to its Public Consultation on Data Protection Law

On June 30, 2022, the European Data Protection Board published draft guidelines on certification as a tool for transfers.  These guidelines complement the EDPB’s earlier guidelines on certification and identifying certification criteria.

These guidelines and the guidelines on codes of conduct as tools for transfers appear to be part of the EDPB’s broader response to the Schrems II decision issued by the Court of Justice of the European Union (“CJEU”), which invalidated the EU-US Privacy Shield framework.  The approval of certification schemes expands the toolbox available under Art. 46 GDPR for lawfully transferring personal data outside the EEA.

Continue Reading European Data Protection Board Publishes Guidelines on Certification as a Tool for International Personal Data Transfers

On June 23, 2022, the German Federal Office for Information Security (“Office”) published technical guidelines on security requirements for healthcare apps, including mobile apps, web apps, and background systems.  Although the technical guidelines are aimed at healthcare app developers, they contain useful guidance for developers of any app that processes or stores sensitive data.

The guidelines set out a number of security levels and a security risk assessment.  The risk assessment takes into account the following aspects: (1) the apps’ purpose; (2) its architecture; (3) the source code; (4) third party software integrations; (5) cryptographic implementation; (6) authentication mechanisms; (7) data storage and protection; (8) auditing of paid resources; (9) network communication; (10) platform-specific interactions; and (11) resilience.  The guidelines also include specific security requirements for digital healthcare apps with biometric authentication mechanisms.

The guidelines are based on state-of-the-art security techniques used in the healthcare sector and the Office’s findings in several of its projects.  They also take into account feedback received from industry stakeholders, the German Federal Institute for Drugs and Medical Devices, and the German Federal Commissioner for Data Protection and Freedom of Information.

The Office offers a certification to healthcare apps that comply with the guidelines.

More than seven months after China’s Personal Information Protection Law (《个人信息保护法》, “PIPL”) went into effect, Chinese regulators have issued several new (draft) rules over the past few days to implement the cross-border data transfer requirements of the PIPL.  In particular, Article 38 of the PIPL sets out three legal mechanisms for lawful transfers of personal information outside of China, namely: (i) successful completion of a government-led security assessment, (ii) obtaining certification under a government-authorized certification scheme, or (iii) implementing a standard contract with the party(-ies) outside of China receiving the data.  The most recent developments in relation to these mechanisms concern the standard contract and certification.

Chinese Government Issues Draft SCCs

On June 30, 2022, the Cyberspace Administration of China (“CAC”) released draft Provisions on the Standard Contract for the Cross-border Transfers of Personal Information (《个人信息出境标准合同规定(征求意见稿)》, “Draft Provisions”) for public consultation.  The full text of the Draft Provisions can be found here (currently available only in Mandarin Chinese).  The public consultation will end on July 29, 2022.

Three takeaways from China’s draft standard contract:

  1. The release of the Draft Provisions marks a major step towards implementing the legal mechanisms for cross-border data transfers under the PIPL.  However, only companies that meet certain thresholds can rely on the standard contract to transfer personal information overseas. 
  2. With the parties to the standard contract limited to a “personal information processing entity” (referred to hereinafter as an “entity”, which is essentially equivalent to a “data controller” under the General Data Protection Regulation, “GDPR”) and the overseas data recipient, it seems that China’s standard contract could apply to transfers (i) from a PRC controller to a non-PRC controller, and (ii) from a PRC controller to a non-PRC processor.  China-based entrusted parties (essentially equivalent to “data processors” under the GDPR) appear unable to rely on this mechanism.
  3. The signed standard contract would need to be filed with the Chinese government.  It is unclear whether any redaction would be allowed.

Pursuant to Article 38 of the PIPL, the standard contract is one of the legal mechanisms that an entity may choose to implement to lawfully transfer personal information outside of China.

As set out in the Draft Provisions, a standard contract can be relied upon for cross-border transfers only if an entity can meet all of the following requirements:

  • it is not a Critical Information Infrastructure (“CII”) operator;
  • it processes the personal information of fewer than 1 million individuals;
  • it has transferred the personal information of fewer than 100,000 individuals on a cumulative basis since January 1 of the previous year; and
  • it has transferred the sensitive personal information of fewer than 10,000 individuals on a cumulative basis since January 1 of the previous year.

In other words, if an entity is required to undergo a CAC-led security assessment according to the draft Measures for the Security Assessment of Cross-border Data Transfers released by the CAC in October 2021, it will not be eligible to use the standard contract as a transfer mechanism.
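Purely as an illustration, the four cumulative eligibility conditions above can be expressed as a simple screening check.  This is a hypothetical helper sketched for clarity, not official tooling and not legal advice:

```python
def eligible_for_standard_contract(
    is_cii_operator: bool,
    individuals_processed: int,
    individuals_transferred_since_jan1_prev_year: int,
    sensitive_individuals_transferred_since_jan1_prev_year: int,
) -> bool:
    """Apply the four cumulative thresholds in the Draft Provisions.

    All four conditions must hold; failing any one of them means the
    entity cannot rely on the standard contract and falls into the
    CAC-led security assessment track instead.
    """
    return (
        not is_cii_operator
        and individuals_processed < 1_000_000
        and individuals_transferred_since_jan1_prev_year < 100_000
        and sensitive_individuals_transferred_since_jan1_prev_year < 10_000
    )
```

For example, a non-CII entity processing data on 500,000 individuals that transferred data on 50,000 individuals (5,000 sensitive) since January 1 of the previous year would pass the check; a CII operator fails regardless of volumes.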

Further, under the Draft Provisions, certain content needs to be specified in the standard contract, which is set out as a template standard contract attached to the Draft Provisions.  In an explanatory note, the CAC explains that the template contract is drafted based on the requirements set out in the Draft Provisions, and parties may negotiate additional provisions and attach them as an annex to the template contract.  It is unclear whether parties must use the template contract, and if so, whether parties may edit the terms in the main body of the template in addition to inserting additional terms in the template contract. 

Within 10 working days of the standard contract taking effect, an entity that implements it is required to submit a filing to the provincial branch of the CAC containing: (1) the standard contract; and (2) a report that includes the personal information protection impact assessment conducted with respect to the transfer, which must be carried out before transferring personal information overseas.

China Releases Final Certification Guidelines for Cross-Border Data Transfers

On June 24, 2022, following public consultation, China’s National Information Security Standardization Technical Committee (TC260) released the Practical Guidelines for Cybersecurity Standards – Specification for Security Certification of Cross-Border Processing of Personal Information (《网络安全标准实践指南—个人信息跨境处理活动安全认证规范》, “Certification Specification”), which took effect immediately.  The full text can be found here (currently available only in Mandarin Chinese).

The Certification Specification is intended to provide a basis for the implementation of one of the personal information protection certification schemes under the PIPL, namely, the certification for processing activities involving certain cross-border data transfers. 

Under the Certification Specification, a certification can be obtained for the following cross-border processing activities:

  • cross-border processing of personal information among subsidiaries or affiliates of a multinational company or the same economic entity; and
  • personal information processing activities covered by the PIPL’s extraterritorial reach under paragraph 2, Article 3 of the PIPL.

An entity can apply for this certification when it complies with (i) the requirements under GB/T 35273 Information Security Technology – Personal Information Security Specification (《信息安全技术 个人信息安全规范》), a non-binding but highly influential national standard issued before the PIPL, for its in-country processing of data, and (ii) the requirements under the Certification Specification when it carries out cross-border processing activities.

It is unclear at this stage whether the certification can be relied upon as one of the valid transfer mechanisms under Article 38 of the PIPL.  The language of the first draft seemed to suggest that a certification obtained under the Certification Specification could serve as such, but the reference to Article 38 was removed from the final version.  Consequently, it is unclear whether the regulators still intend to recognize the certification scheme as a valid transfer mechanism.

Finally, the Certification Specification is not a certification plan that lists all the relevant controls a certification body might review when a company applies for the certification.  Instead, it provides only a high-level description of the criteria that will likely be considered during the certification process.  Accordingly, and confusingly, the Certification Specification does not address a number of key issues typical of a certification of this sort, such as identifying qualified certification bodies or detailing how the certification process will be run by such certification bodies.

In many respects, the Certification Specification is comparable to the EU Binding Corporate Rules (“BCR”) under the GDPR.  For instance, both are intended for use by multinational companies and both set forth detailed information to be specified in a legally binding and enforceable agreement between/among the parties.  However, there are some noteworthy differences.  Notably, under the Chinese Certification Specification, the overseas recipient needs to promise to accept the supervision of the Chinese certification body and “accept the jurisdiction of the relevant Chinese laws and regulations on personal information protection”, while in the BCR, the EU party with delegated responsibilities commits to submit to the jurisdiction of the courts, or other competent authorities in the EU, in case of violation of the BCR by a non-EU party.

On June 23, 2022, the Italian data protection authority (“Garante”) released a general statement (here) flagging the unlawfulness of data transfers to the U.S. resulting from the use of Google Analytics.  The Garante invites all Italian website operators, both public and private, to verify that the use of cookies and other tracking tools on their websites complies with data protection law, in particular with regard to the use of Google Analytics and similar services.

The Garante’s statement follows an order (here) issued against an Italian website operator to stop data transfers to Google LLC in the U.S., and joins other European data protection authorities in their actions relating to the use of Google Analytics (see our previous blogs here and here).

Below we summarize the Garante’s key considerations.

  • Google Analytics’ “IP Anonymization” feature

The Garante analyzes Google Analytics’ so-called “IP Anonymization” feature, which allows the transfer of user IP addresses to Google Analytics after masking the IP address’ last octet.  The Garante finds that this feature constitutes pseudonymization of the IP address, not anonymization.  According to the Garante, the feature does not prevent Google LLC from re-identifying the user, given Google’s ability to enrich such data with additional information it holds, especially where users maintain and use a Google account.
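To illustrate the mechanism the Garante is discussing, the sketch below zeroes the last octet of an IPv4 address, the kind of masking performed by the “IP Anonymization” feature.  This is an illustrative example only, and, as the Garante notes, the result is pseudonymous rather than anonymous, since the remaining octets can still be combined with other data to re-identify a user:

```python
import ipaddress


def mask_last_octet(ip: str) -> str:
    """Return an IPv4 address with its final octet set to zero.

    E.g. the host-identifying last 8 bits are discarded, but the
    /24 network prefix is preserved, so the value remains linkable
    to a relatively small group of users.
    """
    addr = ipaddress.IPv4Address(ip)
    masked = int(addr) & 0xFFFFFF00  # zero out the final 8 bits
    return str(ipaddress.IPv4Address(masked))
```

For example, `mask_last_octet("203.0.113.77")` yields `"203.0.113.0"`: the last octet is gone, yet the address still narrows the user down to at most 256 hosts on one network.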

  • Inadequacy of supplementary measures

After recalling the CJEU’s findings in Schrems II (see our previous blogs here and here), the Garante goes on to find a lack of adequate supplementary measures in place to protect data subjects’ personal data.  In particular, the Garante highlights that the Italian website operator had based its assessment of the transfer on certain subjective criteria, which it deems to be at odds with the recommendations of the EDPB (see our previous blog here).  The Garante finds that the encryption measures adopted by Google LLC cannot be considered sufficient so long as the key remains available to the data importer, and recalls the EDPB’s recommendation that, in the absence of further technical measures, contractual and organizational measures are not by themselves sufficient to prevent access to transferred data.  The Garante does not clarify what, in its opinion, would constitute appropriate technical measures, but states that these must be identified taking into account the EDPB guidance in this area.

The Garante also restates that a data exporter is responsible for implementing appropriate and effective measures under the GDPR and for demonstrating compliance, rejecting the website operator’s argument that it had no capacity, including any bargaining power over Google LLC, to influence the measures applied to the transferred data.

  • The outcome

The Garante ultimately finds that transfers of personal data to the U.S., as a result of the use of Google Analytics, are unlawful.  It orders the website operator to suspend data transfers, and to bring its processing into compliance within 90 days. 

The Garante did not impose a fine, as it considered that (i) the relevant data did not include special categories of personal data, (ii) the website operator had incorrectly assumed that the supplementary measures adopted by Google were appropriate, without having any decision-making power in that respect, (iii) the website operator adopted remedial measures to mitigate the damage to data subjects, and (iv) the website operator cooperated with the Garante in the course of the proceedings.


The Covington team will keep monitoring developments in enforcement cases relating to the CJEU’s Schrems II judgment and Google Analytics, and is happy to assist with any inquiries on the topic.

On June 21, 2022, the Court of Justice of the EU (“CJEU”) decided that the Passenger Name Record (“PNR”) Directive’s provisions providing for the processing of PNR data by competent Member State authorities are compatible with the EU Charter of Fundamental Rights (“Charter”).  However, the CJEU also decided that the PNR Directive limits the way in which Member State laws transpose some of its provisions, particularly in relation to the collection of passenger information for intra-EU flights.  Its decision will require Belgium to amend its law transposing the PNR Directive, mainly in relation to the PNR data competent authorities may receive and how they can process this data.  It is likely to indirectly impact air carriers and tour operators operating in Belgium, as it will reduce the amount of data they need to share with competent authorities under such a revised legal framework.

The CJEU decision also considers Member State laws transposing (1) Council Directive 2004/82/EC on the obligation of carriers to communicate passenger data (the “API Directive”) and (2) Directive 2010/65/EU on reporting formalities for ships arriving in and/or departing from ports of the Member States.

The case was lodged on October 31, 2019, by the non-profit organization Ligue des Droits Humains before the Belgian courts in relation to the Belgian law transposing the PNR and API Directives.  The Belgian Constitutional Court referred certain questions to the CJEU.

Continue Reading Court of Justice of the EU Decides that the Passenger Name Record Directive is Compatible with EU Law

On June 23, Congressman Patrick McHenry released a discussion draft of new legislation to modernize federal financial data privacy law. The draft legislation would amend and build on the Gramm-Leach-Bliley Act (“GLBA”). The draft includes notable provisions on consumer rights, data minimization, and disclosures. It also updates the definition of “financial institution” to include data aggregators and limits the distinction between “consumers” and “customers” under the GLBA. Finally, the law would preempt state laws regulating the obligations of financial institutions with respect to areas covered under the law. Congressman McHenry is the ranking Republican on the House Financial Services Committee, and the draft legislation could provide a framework for further discussions on financial data privacy moving into the next Congress.

On June 14, 2022, representatives of the EU’s Consumer Protection Cooperation (CPC) Network, together with several national data protection authorities in the EU and the secretariat of the European Data Protection Board (“EDPB”), endorsed five key principles for fair advertising to children (see press release here).  These recommendations are based on relevant requirements in EU data and consumer protection laws.

According to the authorities, this joint initiative arises from the proliferation of digital business models that increasingly rely on the use of personal data for commercial purposes, which may be subject to specific rules under both data privacy and consumer protection legislation in Europe. 

In their joint statement, the authorities cited research indicating that children (defined as any individual below the age of 18) are often unable to recognize certain forms of advertising, particularly ads that are deeply embedded in the context of digital media and online games, and, as a result, are particularly susceptible to forms of advertising that may be inappropriate for them.  Therefore, the authorities published these key principles for businesses to apply in order to (1) avoid practices that can be harmful to children and (2) better inform children about when and how their data is used for advertising purposes.

The five advertising principles are:

  1. Take into account the specific vulnerabilities of children when designing advertising or marketing techniques that are likely to target children (in particular, do not deceive or unduly influence them, and consider whether certain types of personalized marketing are inappropriate for them altogether);
  2. Do not exploit the age or credulity of children when engaged in marketing;
  3. Explain to children, in a manner that is appropriate and clear to them, whenever general marketing content is addressed to them or is likely to be seen by them;
  4. Do not target, urge or prompt children to purchase in-app or in-game content, and games marketed “for free” should not require in-app or in-game purchases to continue playing them in a “satisfactory manner”; and
  5. Do not profile children for advertising purposes. 

The authorities emphasize that these five key principles are without prejudice to applicable EU laws, particularly in the areas of consumer protection and data privacy, including any applicable national implementing rules. 

These principles follow a wave of recent child-oriented standards published by European data protection authorities, including (among others) the UK ICO’s Age Appropriate Design Code (see our blog posts here and here), the Irish DPC’s Fundamentals for a Child-Oriented Approach to Data Processing (see our blog posts here, here and here), and the French CNIL’s Eight Recommendations for Protecting Children Online (see our blog post here). 

Moreover, the latest draft of the EU’s Digital Services Act, which has been provisionally agreed by the European Parliament and the Council, requires providers of digital services to implement specific safeguards for protecting children.  Among other things, it requires putting in place “appropriate and proportionate measures to ensure a high level of privacy, safety, and security of minors, on their service”.  It also prohibits providers from showing targeted advertising on their platforms using personal data of individuals who they are “aware with reasonable certainty” to be minors. 

These developments demonstrate the continued focus of European lawmakers and regulators on safeguarding the interests of children, and the importance of businesses staying apprised of these evolving rules and putting in place appropriate measures to ensure compliance.  

The Covington team will keep monitoring any developments in the area of children’s privacy and is happy to assist with any inquiries on the topic.