On February 13, 2026, France’s highest administrative court (“Conseil d’État”) delivered an important decision clarifying the boundary between pseudonymization and anonymization under the GDPR. The ruling confirms that data which remain re‑identifiable in practice—even with some effort—must be treated as personal data under the GDPR by service providers, unless the risk of re‑identification by such providers can genuinely be regarded as insignificant.

Background of the Case

The case arose from enforcement actions brought by the French data protection authority (“CNIL”) against IT service providers active in the healthcare sector. The proceedings concerned the large‑scale processing of health‑related data collected from physicians and pharmacies, which the companies argued had been anonymized prior to use. Following investigations, the CNIL found that the datasets at issue remained personal data within the meaning of the GDPR and imposed administrative fines for multiple infringements.

The companies challenged the CNIL’s decisions before the Conseil d’État. They sought annulment of the fines, arguing that the data no longer related to identifiable individuals and therefore fell outside the scope of the GDPR. In the alternative, they requested that a preliminary question be referred to the Court of Justice of the European Union (“CJEU”) regarding the criteria for anonymization under EU law.

The Conseil d’État’s Reasoning on Anonymization

In assessing whether the datasets at issue were truly anonymized, the Conseil d’État focused on the concrete risk of re‑identification, rather than on the formal application of pseudonymization techniques.

While the companies argued that the datasets at issue contained only patient or client codes, the Conseil d’État highlighted that they also included highly granular individual and medical information, such as age, sex, pathologies, and prescribed or purchased treatments. This information was combined with precise temporal data, including the date—and sometimes the exact time—of medical consultations or pharmacy purchases.

Crucially, the Conseil d’État noted that the datasets also contained elements enabling the identification or localization of healthcare professionals. In particular, for pharmacy data, the collection of prescriber identifiers allowed healthcare professionals to be identified through a simple search using publicly accessible online resources. The Conseil d’État also observed that regional codes had been collected until 2022, further increasing the granularity of the data.

Taken together, these elements made it possible to reconstruct care pathways and individualize patients and their medical conditions. The Conseil d’État endorsed the CNIL’s finding that such re‑identification could be achieved using limited time and ordinary means, without sophisticated tools. In practice, commonly available spreadsheet software, combined with the nomenclatures provided by the companies themselves, was sufficient to associate alphanumeric codes with specific patients and medical acts.

The Conseil d’État further highlighted that the risk of re‑identification was particularly high in certain cases, notably where treatments were rare. That risk could be amplified by the use of additional data already held by the companies, or by combining the datasets with third‑party information, such as geolocation data.

Finally, the Conseil d’État made clear that it was irrelevant whether the companies themselves carried out any re‑identification or inference. What mattered was the objective possibility of identification based on the data and reasonably available means.

On this basis, the Conseil d’État concluded that the CNIL had conducted a concrete and contextual assessment of the re‑identification risk and had correctly found that the pseudonymization could be reversed by reasonable means. The data therefore could not be regarded as anonymized, and a referral to the CJEU was considered unnecessary.

*                                  *                                  *

Covington’s Data Privacy and Cybersecurity team regularly advises companies on their most challenging data protection and compliance issues in the UK, EU and other key markets. If you have any questions about the topics discussed in this article, please do not hesitate to contact us.

Alix Bertrand

Alix advises clients on EU data protection and technology law, with a particular focus on French privacy and data protection requirements. She regularly assists clients in relation to international data transfers, direct marketing rules as well as IT and data protection contracts. Alix is a member of the Paris and Brussels Bars.

Kristof Van Quathem

Kristof Van Quathem advises clients on information technology matters and policy, with a focus on data protection, cybercrime and various EU data-related initiatives, such as the Data Act, the AI Act and EHDS.

Kristof has been specializing in this area for over twenty years and has developed particular experience in the life sciences and information technology sectors. He counsels clients on government affairs strategies concerning EU lawmaking and their compliance with applicable regulatory frameworks, and has represented clients in non-contentious and contentious matters before data protection authorities, national courts and the Court of Justice of the EU.

Kristof is admitted to practice in Belgium.