The UK Information Commissioner’s Office (“ICO”) recently published detailed draft guidance on what “likely to be accessed” by children means in the context of its Age-Appropriate Design Code (“Code”), which came into force on September 2, 2020. The Code applies to online services “likely to be accessed by children” in the UK, where “children” are individuals under the age of 18. To determine whether an online service is “likely to be accessed” by children, companies must assess whether the nature and content of the service has “particular appeal for children”, as well as “the way in which the service was accessed”. The new draft guidance provides further assistance on how to make this assessment, and is undergoing a public consultation until May 19, 2023.

The draft guidance reiterates that the Code applies both to:

  • services that are not intended for use by under-18s, but are nonetheless likely to be used by them; and
  • services that are aimed at children.

The draft guidance focuses on the measures that “adult-only” services (i.e., those not intended for use by under-18s) could take to ensure they are not likely to be accessed by under-18s, taking them out of scope of the Code.

In particular, it clarifies:

  • Adult-only services must look at the available evidence to determine whether they are likely to be accessed by children. The ICO provides a non-exhaustive list of factors that providers must take into account, along with some detailed use cases explaining how they might apply. The factors to be considered include:
    • information about the actual ages of users, which the ICO suggests could come directly from those users or through other sources, including “age-profiling tools”;
    • research (internal or external) about the actual or likely ages of users of similar services. For example, the ICO states that research showing that under-18s play one type of video game could indicate that under-18s will also be likely to access similar games. The guidance also provides the example of online dating sites, suggesting that market research might indicate that there are a significant number of active under-18 users;
    • the prevalence of advertising aimed at children on the service;
    • the content available on the site, and whether that content appeals to children. The guidance uses the example of a social media platform hosting manga cartoons, images with bright colors, emojis, and live video-game streaming, which may be more appealing to children; and
    • how the service is marketed.

This assessment should be kept under review, and providers must comply with the Code if it becomes clear that “a significant number of children are in fact accessing [the] service”.

  • Services that have age-gating pages that prevent under-18s from accessing the service are not covered by the Code, provided that the age-gating is effective. However, the ICO notes that a simple self-declaration of age is “unlikely” to be an effective way of restricting access to adults only. The draft guidance does not explain in detail what “effective” age-gating would look like, merely giving the example of a website offering adult content that uses “robust age assurance methods through several third-party technological solutions”. See our previous blog on EU and UK developments on age verification here.
  • Providers should document their decisions about whether their services are likely to be accessed by children, and what steps they have chosen to take as a result of that decision (e.g., whether and how to age-gate the service, or how it will comply with the Code).

***

Covington’s Data Privacy and Cybersecurity Team will continue to monitor these developments.  Our team is happy to assist with any inquiries relating to age verification and children’s privacy.

Dan Cooper

Daniel Cooper is co-chair of Covington’s Data Privacy and Cyber Security Practice, and advises clients on information technology regulatory and policy issues, particularly data protection, consumer protection, AI, and data security matters. He has over 20 years of experience in the field, representing clients in regulatory proceedings before privacy authorities in Europe and counseling them on their global compliance and government affairs strategies. Dan regularly lectures on the topic, and was instrumental in drafting the privacy standards applied in professional sport.

According to Chambers UK, his “level of expertise is second to none, but it’s also equally paired with a keen understanding of our business and direction.” It was noted that “he is very good at calibrating and helping to gauge risk.”

Dan is qualified to practice law in the United States, the United Kingdom, Ireland and Belgium. He has also been appointed to the advisory and expert boards of privacy NGOs and agencies, such as Privacy International and the European security agency, ENISA.

Paul Maynard

Paul Maynard is an associate in the technology regulatory group in the London office. He focuses on advising clients on all aspects of UK and European privacy and cybersecurity law relating to complex and innovative technologies such as adtech, cloud computing and online platforms. He also advises clients on how to respond to law enforcement demands, particularly where such demands are made across borders.

Paul advises emerging and established companies in various sectors, including online retail, software and education technology. His practice covers advice on new legislative proposals, for example on e-privacy and cross-border law enforcement access to data; advice on existing but rapidly-changing rules, such as the GDPR and cross-border data transfer rules; and on regulatory investigations in cases of alleged non-compliance, including in relation to online advertising and cybersecurity.

Sam Jungyun Choi

Sam Jungyun Choi is an associate in the technology regulatory group in the London office. Her practice focuses on European data protection law and new policies and legislation relating to innovative technologies such as artificial intelligence, online platforms, digital health products and autonomous vehicles. She also advises clients on matters relating to children’s privacy and policy initiatives relating to online safety.

Sam advises leading technology, software and life sciences companies on a wide range of matters relating to data protection and cybersecurity issues. Her work in this area has involved advising global companies on compliance with European data protection legislation, such as the General Data Protection Regulation (GDPR), the UK Data Protection Act, the ePrivacy Directive, and related EU and global legislation. She also advises on a variety of policy developments in Europe, including providing strategic advice on EU and national initiatives relating to artificial intelligence, data sharing, digital health, and online platforms.