On March 21, 2022, the European Data Protection Board (“EDPB”) published its draft Guidelines 3/2022 on Dark patterns in social media platform interfaces (hereafter, the “Guidelines”), following the EDPB’s plenary session held on March 14, 2022.  The stated objective of the Guidelines is to provide practical guidance to both designers and users of social media platforms on how to identify and avoid so-called “dark patterns” in social media interfaces that would violate requirements set out in the EU’s General Data Protection Regulation (“GDPR”).  In this sense, the Guidelines serve both to instruct organizations on how to design their platforms and user interfaces in a GDPR-compliant manner, and to educate users on how certain practices they are subject to could run contrary to the GDPR (which could, in turn, lead to an increase in GDPR complaints arising from such practices).  The Guidelines are currently subject to a six-week period of public consultation, and interested parties are invited to submit feedback directly to the EDPB.

In this blog post, we summarize the Guidelines and identify key takeaways.  Notably, while the Guidelines are targeted to designers and users of social media platforms, they may offer helpful insights to organizations across other sectors seeking to comply with the GDPR, and in particular, its requirements with respect to fairness, transparency, data minimization, purpose limitation, facilitating personal data rights, and so forth.

Setting the Stage

At the outset of the Guidelines, the EDPB defines “dark patterns” as “interfaces and user experiences implemented on social media platforms that lead users into making unintended, unwilling and potentially harmful decisions regarding the processing of their personal data”.  The EDPB then provides a taxonomy of six defined categories of dark patterns, namely:

  1. Overloading – overwhelming users with a large quantity of requests, information, options, or possibilities to prompt them to share more data;
  2. Skipping – designing the interface or user experience in a way that causes users to forget (or fail to consider) all or certain data protection aspects of a decision;
  3. Stirring – appealing to the emotions of users, or using visual nudges, to influence the choices they would otherwise make;
  4. Hindering – obstructing or blocking users from becoming informed about the use of their data or exercising control over it by making certain actions hard or impossible to achieve;
  5. Fickle – designing the interface in an inconsistent and unclear manner which makes it difficult to navigate user controls or understand processing purposes; and
  6. Left in the dark – designing the interface in a way to hide information or privacy controls, or to leave users uncertain about how their data is processed and the control they can exercise over it.

The EDPB notes that these six categories can also be thematically framed as “content-based patterns” (i.e., referring to the content of information presented to users, including the context, wording used, and informational components) or “interface-based patterns” (i.e., referring to the manner that content is displayed, navigated through, or interacted with by users, which can have a direct influence on the perception of dark patterns).

Beneath the six over-arching categories of dark patterns outlined above, the EDPB then identifies 15 specific dark pattern behaviors and considers how each of them can manifest during the lifecycle of a social media user account, a continuum which the EDPB breaks down into the following five stages: (1) opening a social media account; (2) staying informed on social media; (3) staying protected on social media; (4) exercising personal data rights on social media; and (5) leaving a social media account.

Back to the Basics

Before delving into the five use cases within the social media user lifecycle, the EDPB underscores several fundamental GDPR principles for organizations and individuals to bear in mind whenever considering (and seeking to avoid) the use of dark patterns, namely:

  • Fairness and Transparency (Article 5(1)(a) GDPR): The EDPB emphasizes that the GDPR’s fairness principle “serves an umbrella function” to prevent personal data from being processed in a manner that is detrimental, discriminatory, misleading, or unexpected from the data subject’s perspective.  Regarding transparency, and pursuant to Article 12 GDPR, information must be provided to data subjects in a “concise, transparent, intelligible and easily accessible form, using clear and plain language” – a key consideration for any examination of potential dark patterns.
  • Accountability (Article 5(2) GDPR): The EDPB says that the user interface/journey on a social media platform can itself be used as a way to demonstrate compliance with GDPR requirements vis-à-vis social media users.  Notably, the EDPB encourages operators of social media platforms to conduct qualitative and quantitative research (e.g., A/B testing, eye tracking, and user interviews), as well as recording screenshots of the interface, to show how they successfully fulfil the GDPR’s informational requirements.
  • Data Protection by Design and by Default (Article 25 GDPR): The EDPB notes that specific elements must be taken into account here to ensure effective privacy by design and by default, in particular: respecting the autonomy and reasonable expectations of data subjects, enabling their easy interaction and exercise of rights, fostering consumer choice and avoiding imbalances of power, and providing information in an objective and neutral way that does not deceive, manipulate, or mislead data subjects.

Deep Dive into Dark Patterns in the Social Media User Lifecycle

The EDPB provides a deep-dive analysis of dark patterns across the five stages of a social media user’s account lifecycle, dividing each section into (a) a description of the relevant context; (b) a summary of the relevant legal provisions; (c) an examination of specific dark patterns (whether content-based or interface-based), with multiple examples for each; and (d) a list of best practices to help avoid dark patterns.

Below, we call out some noteworthy remarks of the EDPB in relation to these five stages of the social media life cycle.

  • Opening a social media account
    • Consent: The EDPB says that social media providers must pay special attention to ensure consent is distinguishable when requested at the sign-up stage; otherwise, if users are overwhelmed with so much information at this onboarding step that it motivates them not to read all of it, but then they are required to confirm they have read the entire privacy policy, “this can qualify as forced consent to special conditions named [in the privacy policy]” (if consent is relied upon).  Users of social media platforms must also be able to withdraw their consent in one click if a single click was the means by which the platform obtained their consent at the sign-up stage.
    • Data Minimization: The EDPB also zeroes in on the principle of data minimization, stating in no uncertain terms that social media platforms should not seek to obtain more personal information than is objectively necessary for the purpose(s) pursued.  Here, the EDPB gives the example of a platform requesting a phone number for two-factor security authentication of an account, even though an email address could also be used for the same purpose (e.g., to prove that a particular user is in possession of the device used to log into the platform).
    • Examples of Dark Patterns: The EDPB highlights the use of various dark patterns at this stage of the life cycle, such as the use of “emotional steering” – i.e., the use of words or visuals to convey information to users in either (a) a highly positive outlook, making users feel good or safe, or (b) a highly negative one, making users feel anxious or guilty – as well as nudging users towards sharing more data as the default option.
  • Staying informed on social media
    • Transparency: The EDPB stresses that while the GDPR has prescriptive transparency requirements (see Articles 12-14 GDPR), “more information does not necessarily mean better information[,] [and] too much irrelevant or confusing information can obscure important content points or reduce the likelihood of finding them”. Thus, the EDPB advocates the use of layered privacy notices that strike the appropriate balance between comprehensive information and easy accessibility.  The EDPB contrasts this with the use of dark patterns such as conflicting information or information that lacks logical consistency, or that offers ambiguous wording and/or overloads users by making certain information difficult to find.  Here again, the EDPB advocates testing the use of layered privacy notices with users to get their feedback and ensure their comprehension of the relevant information.
    • Joint Controllership: The EDPB notes that the provisions in Article 26 GDPR requiring joint controllers to disclose to data subjects the essence of their joint controller arrangement impose additional transparency obligations on joint controllers, which may include social media platforms in certain circumstances.
    • Communication of Data Breaches: The EDPB warns against providing non-specific or irrelevant information to data subjects about data breaches – for example, indicating that a processor’s data breach was not a security breach of the controller; suggesting that the severity of the breach hinges on the effects on the platform or its processor, as opposed to the impact on data subjects; or not indicating which specific types of special category personal data were exposed in the breach.
  • Staying protected on social media
    • Consent: Here again, the EDPB focuses on the means provided for social media users to give or withdraw their consent for various processing purposes, such as targeted advertising (including where obligations in relation to the use of cookies and trackers under the ePrivacy Directive apply), and the types of dark pattern behaviors that can be used to impede the withdrawal of consent.
    • User Controls: The EDPB raises concerns about dark patterns in the presentation of user controls, particularly when users are provided too many options and menus (or sub-menus) to choose from, which may be unclear or logically incongruent, rather than offering a clear, centralized means to manage privacy choices.  Notably, the EDPB states that while “[t]here is no ‘one size fits all approach’ when it comes to the average number of steps [] for users of social media platforms to take when changing a setting…[t]he number of steps required should [] be as low as possible”.
  • Exercising personal data rights on social media
    • Here, the EDPB raises concerns around dark patterns such as leading users to “dead ends” by redirecting them to irrelevant pages, providing insufficient or ambiguous disclosures, putting users through a “privacy maze” in order to exercise personal data rights, not providing user-friendly ways to exercise rights (e.g., a direct link to download a copy of their data), and making certain steps longer than necessary (e.g., asking users if they are “sure” they wish to take a particular action).
  • Leaving a social media account
    • The EDPB says that if the exercise of the right of erasure is made difficult without a justifiable reason, then this constitutes a violation of the GDPR, and that a user’s request to delete their social media account must be understood as an implicit withdrawal of consent (if relied upon) for processing under Article 7(3) GDPR. The EDPB also raises concerns about dark patterns at this stage that lead users through a “privacy maze” in order to delete their account, seek to emotionally steer users into keeping their account, provide ambiguous information or confusing options, or lead users to “dead ends” that interrupt the deletion process.

Conclusion

These Guidelines are part of a broader trend in Europe and across other jurisdictions whereby regulators and courts are holding accountable organizations that design websites, platforms, or other consumer-facing interfaces in a way that could be interpreted as deceiving, manipulating, or unduly influencing consumers toward less privacy-protective choices.  In this respect, as highlighted above, these Guidelines (and particularly the best practices identified by the EDPB) could be instructive for organizations outside the social media sector that wish to avoid practices likely to attract scrutiny from European privacy authorities.  Moreover, as pointed out by the EDPB, dark patterns can also violate consumer protection laws, potentially subjecting parties engaging in these practices to a dual enforcement regime.

Nicholas Shepherd

Nicholas Shepherd is an associate in Covington’s Brussels office, where he is a member of the Data Privacy and Cybersecurity practice group, advising clients on compliance with all aspects of the European General Data Protection Regulation (GDPR), ePrivacy Directive, European direct marketing laws, and other privacy and cybersecurity laws worldwide.  Nick counsels on topics that include adtech, anonymization, children’s privacy, cross-border transfer restrictions, and much more, providing advice tailored to product- and service-specific contexts to help clients apply a risk-based approach in addressing requirements related to transparency, consent, lawful processing, data sharing, and others.

A U.S.-trained and qualified lawyer registered on the B-List of the Brussels Bar, Nick leverages his multi-faceted legal background and international experience to provide clear and pragmatic advice to help organizations address their privacy compliance obligations across jurisdictions.

Dan Cooper

Daniel Cooper is co-chair of Covington’s Data Privacy and Cyber Security Practice, and advises clients on information technology regulatory and policy issues, particularly data protection, consumer protection, AI, and data security matters. He has over 20 years of experience in the field, representing clients in regulatory proceedings before privacy authorities in Europe and counseling them on their global compliance and government affairs strategies. Dan regularly lectures on the topic, and was instrumental in drafting the privacy standards applied in professional sport.

According to Chambers UK, his “level of expertise is second to none, but it’s also equally paired with a keen understanding of our business and direction.” It was noted that “he is very good at calibrating and helping to gauge risk.”

Dan is qualified to practice law in the United States, the United Kingdom, Ireland and Belgium. He has also been appointed to the advisory and expert boards of privacy NGOs and agencies, such as Privacy International and the European security agency, ENISA.