Regulators in Europe and beyond have been ramping up their efforts on online safety for minors through new legislation and guidance, and by promoting self-regulatory tools. Below, we discuss recent developments in the EU and UK on age verification online.
Potential methods of verifying users’ age online include self-declaration, parental consent, use of biometrics (including facial recognition technologies to ascertain physical features of the user’s face, or to check their correspondence with an ID picture), analysis of online usage patterns, and reliance on digital ID verification systems offered by some governments. However, neither regulators nor industry has so far settled on a single standard solution.
At the EU level, there is not yet a common stance on the best way to verify users’ age online, nor is one likely to emerge any time soon, as the methods selected are to be calibrated to risk. There is, however, a growing body of EU law requiring organizations to adopt appropriate age verification measures. For instance, the Audiovisual Media Services Directive requires the adoption of appropriate measures to protect children from harmful content, including through age verification. The GDPR contains rules on obtaining parental consent where consent is relied on as the legal basis for processing children’s personal data in the context of providing online services. The recently adopted Digital Services Act (“DSA”) also contains rules on protecting children online – including by prohibiting targeted advertising to minors based on profiling. (For more information on the DSA, see our previous blogpost here.)
In May 2022, the European Commission announced its Strategy for a better internet for kids (BIK+), listing age verification as a priority. To this end, the Commission proposes to:
- develop an EU code for age-appropriate design by 2024, building on the framework of the DSA; and
- establish a European standard on online age verification in the context of the European Digital Identity (“eID”) proposal. The eID proposal would also enable minors to use their digital identity wallet to prove their age without disclosing other personal data.
In addition, the European Data Protection Board (“EDPB”) is expected to issue guidelines on children’s data and on the use of technologies for detecting and reporting online child sexual abuse, as announced in its 2023/2024 Work Programme (see our previous blogpost here).
National-level developments on age verification
National-level guidance on children’s privacy – such as that issued by the French and UK data protection authorities – has stressed age verification as an important safeguard. Below, we set out some recent developments relating to age verification at a national level.
The French Parliament is currently examining a legislative proposal to establish an age of “digital consent”. The current text of the proposal would require social network providers to implement certified technical solutions to verify users’ age and parental consent. The certifying authority would be the newly created ARCOM (Autorité de régulation de la communication audiovisuelle et numérique), which has competence over the audiovisual and digital communications sectors. ARCOM is expected to create a repository of tools, in consultation with the French data protection authority, the CNIL. The proposal has been adopted by the French National Assembly, and is currently being discussed within the Senate.
Moreover, in a February 2023 statement, the CNIL welcomed the launch of an experimental solution for blocking access to websites and services reserved for adults that complies with the GDPR and the CNIL’s 2022 recommendations.
To date, the Italian Garante has not issued specific guidance on age verification, but it is increasingly active in campaigns on children’s privacy more broadly. In a public statement, the Garante described age verification systems as “indispensable” to protecting children.
In a 2023 investigation, the Garante ordered the temporary limitation of processing by a company operating an AI-powered chatbot. The Garante found that the chatbot’s inappropriate content posed concrete risks for minors and vulnerable subjects, and that the company had not established any age verification procedures, nor any mechanisms to ban or block access, even after a user explicitly declared that they were underage. The Garante did not, however, recommend a specific, effective method of age verification in its decision.
In October 2021, the UK Information Commissioner’s Office (“ICO”) issued its opinion on Age Assurance for the Children’s Code, setting out guidance on the effectiveness of, and privacy concerns associated with, various age assurance methods. The ICO calls on organizations to take a “risk-based” approach to age assurance, and stresses that any processing of personal data in connection with age assurance technologies should comply with applicable data protection law. The ICO recommends the use of appropriately certified suppliers – for example, providers approved under the Age Check Certification Scheme (“ACCS”), which checks that providers meet the current industry standard. In late 2022, the ICO announced that audits and investigations on the application of the Code were in progress.
Additionally, in February 2023, the ICO published its guidance for the video game industry on how to conform with the UK’s Age Appropriate Design Code when developing video games (see our previous blogpost here).
Given the increasing focus of regulators across Europe on protecting children online, we expect further guidance to be issued and concrete schemes to be set up to help organizations verify the age of the users of their online services.
Covington’s Data Privacy and Cybersecurity Team will continue to monitor developments. Our team is happy to assist with any inquiries relating to age verification and children’s privacy.