On May 13, 2025, the European Commission issued its draft Guidelines on the protection of minors online under the DSA (“the Guidelines”).  The Guidelines aim to support providers of online platforms that are “accessible to minors” in meeting their obligation to ensure “a high level of privacy, safety, and security” for minors under Article 28(1) of the Digital Services Act (“DSA”).

Below we provide an overview of the Guidelines and key takeaways.

Scope

The Guidelines are addressed to providers of online platforms whose services are “accessible to minors”.  Recital 71 DSA clarifies that an online platform can be considered as “accessible to minors” when “its terms and conditions permit minors to use the service, when its service is directed at or predominantly used by minors, or where the provider is otherwise aware that some of the recipients of its service are minors”.

The Commission considers that platforms may be considered as “accessible to minors” in the following circumstances, among others:

  • A platform’s terms and conditions prohibit minors from accessing the service, but the provider does not implement any effective measures to prevent such access;
  • The provider already processes users’ personal data for purposes other than age verification, and that data reveals the user’s age; or
  • The platform is of a type known to appeal to minors, offers services similar to those used by minors, or promotes its services to minors.

While the Guidelines are not legally binding, once finalized they will constitute a “significant and meaningful benchmark” on which the Commission will rely when enforcing the DSA.  At the same time, the Commission has clarified that implementing the recommended measures, in whole or in part, will not automatically give rise to a presumption of compliance with Article 28(1) DSA.

Risk Review

The Commission recommends that providers of platforms accessible to minors (hereinafter, “providers”) carry out a “risk review”.  This review aims to determine which measures are most appropriate and proportionate to meet the provider’s obligations under Article 28(1) DSA.

At a minimum, providers should identify and assess the following elements:

  • Likelihood of minors accessing their service;
  • The risks that the platform may pose to the privacy, safety and security of minors, based on the “5C” typology of risks developed by the OECD, with the aim of identifying the types of risks that may infringe minors’ rights;
  • The measures already implemented to prevent and mitigate these risks, and their potential positive and negative effects on minors’ rights;
  • Any additional measures identified in the review.

With regard to providers of “very large online platforms” and “very large online search engines,” as defined in the DSA, the Commission clarifies that this risk review should be carried out as part of, and complement, the risk assessment required under Article 34 DSA.

Age Assurance Measures

Providers should conduct an assessment to determine whether, and which of, the available age assurance methods (i.e., age verification, age estimation, and self-declaration) are most suitable to address the risks that their service may pose to minors.

The Commission considers that age verification would be necessary in the following scenarios:

  • Services restricted to 18+ (e.g., sale of alcohol, access to gambling or pornographic content);
  • Services designed for an adult-only audience (e.g., adult dating platforms);
  • Services whose terms and conditions require a minimum age of 18;
  • High-risk services, where less far-reaching measures are deemed insufficient.

In the Commission’s view, methods that rely on verified and trusted government-issued IDs, such as the EU Digital Identity Wallet, which EU Member States are expected to make available to citizens, residents and businesses by the end of 2026, may constitute effective age verification.  In the meantime, the Commission is developing an “EU age verification solution” that can be used as a device-based age verification method.

Conversely, the Commission does not believe that “self-declaration of age” would be appropriate to meet the objectives of Article 28(1) DSA, due to its insufficient robustness and accuracy.

Recommended Measures

In the following sections of the Guidelines, the Commission lists a range of detailed measures and steps that providers should take, as appropriate, to meet their obligations under Article 28(1) DSA.  Among others, the recommendations cover the following areas:

  • Registration: features and practices to be implemented in the user registration process;
  • Account settings: settings, features and functionalities to be turned on or off by default;
  • Online interface design: functionalities, tools, and options that allow minors to decide how they engage with the service;
  • Recommender systems and search features: parameters, factors and input to consider and/or evaluate as part of the design and operation of recommender systems, as well as their testing and adaptation;
  • Commercial practices: recommended practices, transparency recommendations, and information to be provided to minors in the context of economic transactions and advertising;
  • Content moderation: transparency disclosures relating to content moderation affecting minors, strategies to prioritize moderation, and technical measures and features to minimize the risk of exposing minors to harmful content;
  • Reporting, user support and tools for guardians: recommendations to adapt reporting, feedback and complaint mechanisms for minors, tools and controls for guardians, support resources and appropriate warnings;
  • Governance: internal policies to ensure a high level of privacy, safety and security of minors on a service, assignment of dedicated human resources, and training of relevant staff;
  • Terms and conditions: transparency disclosures explaining the measures deployed to protect minors and how they work, as well as guidance on presenting child-friendly, age-appropriate, and easily understandable information.

Next Steps

The Guidelines are open for public consultation until June 10, 2025.  After that, the Commission will review the feedback it receives from stakeholders and consider whether any changes or additions to the Guidelines are necessary.  

***

Covington’s Data Privacy and Cybersecurity Team regularly advises clients on their most challenging regulatory and compliance issues in the EU, including the Digital Services Act and other emerging tech regulation.  If you have questions about these Guidelines, or the Digital Services Act, we are happy to assist with any queries.


Daniel Cooper is co-chair of Covington’s Data Privacy and Cyber Security Practice, and advises clients on information technology regulatory and policy issues, particularly data protection, consumer protection, AI, and data security matters. He has over 20 years of experience in the field, representing clients in regulatory proceedings before privacy authorities in Europe and counseling them on their global compliance and government affairs strategies. Dan regularly lectures on the topic, and was instrumental in drafting the privacy standards applied in professional sport.

According to Chambers UK, his “level of expertise is second to none, but it’s also equally paired with a keen understanding of our business and direction.” It was noted that “he is very good at calibrating and helping to gauge risk.”

Dan is qualified to practice law in the United States, the United Kingdom, Ireland and Belgium. He has also been appointed to the advisory and expert boards of privacy NGOs and agencies, such as the IAPP’s European Advisory Board, Privacy International and the European security agency, ENISA.

Laura Somaini is an associate in the Data Privacy and Cybersecurity Practice Group.

Laura advises clients on EU data protection, e-privacy and technology law, including on Italian requirements. She regularly assists clients in relation to GDPR compliance, international data transfers, direct marketing rules as well as data protection contracts and policies.