On December 9, 2022, the European Commissioner for Justice and Consumer Protection, Didier Reynders, announced that the European Commission will focus its 2023 mandate on regulating dark patterns, alongside transparency in the online advertising market and cookie fatigue. As part of this mandate, the EU’s Consumer Protection Cooperation (“CPC”) Network conducted a sweep of 399 retail websites and apps for dark patterns, and found that nearly 40% of online shopping websites rely on manipulative practices to exploit consumers’ vulnerabilities or trick them.

The EU does not have a single piece of legislation regulating dark patterns; instead, multiple regulations address dark patterns and may be used as tools to protect consumers from them. These include the General Data Protection Regulation (“GDPR”), the Digital Services Act (“DSA”), the Digital Markets Act (“DMA”), and the Unfair Commercial Practices Directive (“UCPD”), as well as proposed regulations such as the AI Act and Data Act.

As a result, there are several regulations and guidelines that organizations must consider when assessing whether their practices may be deemed a dark pattern. In this blog post, we provide a snapshot of the current EU legislation that regulates dark patterns, as well as upcoming legislative updates that will regulate dark patterns alongside the current legal framework.

Legal Framework on Dark Patterns

There is no single definition of the term ‘dark patterns’; broadly, it refers to manipulative or deceptive practices that cause consumers to do something they did not intend or want to do, especially where this leads to a negative consequence. For example:

  • The European Data Protection Board (“EDPB”) defines ‘dark patterns’ as “interfaces and user experiences implemented on social media platforms that lead users into making unintended, unwilling and potentially harmful decisions in regards to their personal data with the aim of influencing users’ behaviors”. The EDPB also defines six categories of dark patterns — (1) overloading, (2) skipping, (3) stirring, (4) hindering, (5) fickle, and (6) left in the dark — which are described in further detail in our blog post.
  • The proposed Data Act similarly describes ‘dark patterns’ as a “design technique or mechanism that push or deceive consumers into decisions that have negative consequences for them. These manipulative techniques can be used to persuade users, particularly vulnerable consumers, to engage in unwanted behaviours, and to deceive users by nudging them into decisions on data disclosure transactions or to unreasonably bias the decision-making of the users of the service, in a way that subverts and impairs their autonomy, decision-making and choice”.

Despite the varying descriptions, the common features of a ‘dark pattern’ are the (i) manipulative or deceptive nature and the (ii) resulting negative or harmful outcome on the consumer.

This ‘dark patterns’ language is pervasive across EU legislation, and can be found within different rules, guidelines and principles. Therefore, when organizations seek to consider what a ‘dark pattern’ is and how this affects their practices, it is important to consider a multitude of regulations, for example:

  • GDPR and ePrivacy Directive. While the GDPR and the ePrivacy Directive do not explicitly mention dark patterns, they form part of the current legal framework that regulates them. For example, where organizations rely on consent as the legal basis for processing personal data under the GDPR, or obtain consent for cookies or marketing communications under the ePrivacy Directive, they may engage in dark patterns when collecting that consent.
    • EDPB Guidelines 03/2022 on Dark patterns in social media platform interfaces (“Guidelines”) offer practical recommendations on assessing ‘dark patterns’ in social media platforms. The Guidelines note that ‘dark patterns’ may have the potential to hinder users’ ability to provide their “freely given, specific, informed and unambiguous consent”, in turn violating their right to privacy from a data protection and consumer protection perspective. As a practical example, an organization may engage in ‘dark patterns’ where it uses words or visuals that convey information to users in either (a) a highly positive way, making users feel good or safe, or (b) a highly negative way, making users feel anxious or guilty, particularly to nudge users towards sharing more data as the default option. For more information about the Guidelines, please check out our blog post.
    • CPC – EDPB Joint Principles for Fair Advertising to Children. On June 14, 2022, representatives of the EU’s CPC Network, together with several national data protection authorities in the EU and the secretariat of the EDPB, endorsed five key principles for fair advertising to children (see the press release). These include, for example, taking into account the specific vulnerability of children when designing advertising or marketing techniques that are likely to target them (in particular, such techniques must not deceive or unduly influence children), and not targeting, urging or prompting children to purchase in-app or in-game content. This requires organizations to take greater care to avoid dark patterns when creating online interfaces that are targeted at children. For more information on these principles and children’s privacy, please take a look at our previous blog post.
  • UCPD. The Unfair Commercial Practices Directive prohibits unfair commercial practices affecting consumers’ economic interests before, during and after the conclusion of a contract. On December 29, 2021, the European Commission published guidance on the UCPD that confirms that the UCPD covers dark patterns and dedicates a section (4.2.7) to explain how the relevant provisions of the UCPD can apply to data-driven business-to-consumer commercial practices.
    • The UCPD covers commercial practices such as capturing the consumer’s attention, which results in transactional decisions such as continuing to use the service (e.g., scrolling through a feed), viewing advertising content, or clicking on a link. To the extent that these practices include dark patterns and are therefore misleading, they would violate the UCPD. For example, dark patterns have the potential of materially distorting the economic behavior of the average consumer in the context of online advertising, and therefore potentially fall under the UCPD (see our blog post).
  • DSA. The DSA specifically prohibits deceptive or nudging techniques, including dark patterns, that could distort or impair a user’s free choice, such as giving more visual prominence to a consent option or repetitively requesting or urging users to make a decision. Additionally, under the DSA, the European Commission is also empowered to adopt delegated acts to define additional practices that may fall within the scope of dark patterns. For more information on the DSA, please see our blog post.
  • DMA. The DMA does not explicitly mention dark patterns, but it imposes obligations on gatekeepers that are described in similar terms. For example, with respect to free user choice and consent withdrawal, gatekeepers “should not design, organize or operate their online interfaces in a way that deceives, manipulates or otherwise materially distorts or impairs the ability of end users to freely give consent”. To this end, the DMA requires that users be able to withdraw their consent as easily as they gave it, without additional burdens. Failure to provide users with an easy mechanism to withdraw their consent would likely be deemed a dark pattern and would be considered a contravention of the DMA. For more detail on the DMA, please check out our blog post.

Proposed Legislation

The regulation around dark patterns is continuously evolving and being incorporated into new legislation, particularly as more studies and investigations shed light on the negative effects that dark patterns have on consumers. The following upcoming rules will also regulate the use of dark patterns:

  • The proposed AI Act. The proposed AI Act sets out rules on the development, placing on the market, and use of artificial intelligence systems (“AI systems”) across the EU. While the AI Act is still undergoing the legislative process, the current proposal prohibits the use of dark patterns within AI systems. Namely, the proposal explicitly prohibits “the placing on the market, putting into service or use of an AI system that deploys subliminal techniques beyond a person’s consciousness in order to materially distort a person’s behaviour in a manner that causes or is likely to cause that person or another person physical or psychological harm”. Therefore, manufacturers of AI systems would be prohibited from using deceptive techniques like dark patterns, and would need to take into consideration the general data protection principles promoted by the GDPR — namely transparency, accountability, and data minimization, among others — to avoid the use of dark patterns within their AI systems. To find out more about the proposed AI Act, please check out our blog post.
  • The proposed Data Act. The proposed Data Act aims to facilitate greater access to and use of data, such as allowing users to access and port to third parties the data generated through their use of connected products and services. As part of this, the third party that receives this data is under an obligation not to “coerce, deceive or manipulate the user in any way, by subverting or impairing the autonomy, decision-making or choices of the user, including by means of a digital interface with the user”. Recital 34 explains that this means that third parties should not rely on dark patterns when designing their digital interfaces, particularly in a way that manipulates consumers to disclose more data — the third party should therefore comply with the data minimization principle as defined in the GDPR to ensure that they do not employ dark pattern practices in their interfaces. For more detail on the proposed Data Act, please see our blog post.
  • Digital Fairness Consultation. On November 28, 2022, the European Commission published a digital fairness public consultation, which is currently open until February 20, 2023. The aim of the consultation is to determine whether it is necessary to update existing consumer protection legislation (i.e., the Unfair Commercial Practices Directive, Consumer Rights Directive, and Unfair Contract Terms Directive) in order to adapt to the digital transformation of the online world. In particular, the European Commission will consider whether existing consumer protection legislation is adequate to protect consumers against novel consumer protection issues, such as online deceptive and nudging techniques, including dark patterns, among other consumer protection concerns (personalization practices, influencer marketing, marketing of virtual items etc.).

Following the consultation, the European Commission will publish a Staff Working Document, which will address these issues and potentially recommend a new legislative proposal that will regulate dark patterns further. In the meantime, the EU has already pursued dark pattern enforcement, for example:

  • As part of the European Commission’s New Consumer Agenda (which encompasses the dark patterns mandate), in April 2022, the European Commission released its Behavioural study on unfair commercial practices in the digital environment, which examines the use of dark patterns and manipulative personalization and identifies the potential gaps in existing consumer protection legislation to tackle concerns relating to dark patterns. The European Commission will contact online traders identified in this study to ask them to rectify the issues identified.
    • As mentioned above, the CPC Network has conducted online sweeps to identify the use of ‘dark patterns’ on websites and apps — the European Commission press release notes that nearly 40% of the online shopping websites reviewed (148 out of 399) rely on manipulative practices to exploit consumers’ vulnerabilities or trick them (e.g., fake countdown timers, hidden information, and web interfaces designed to lead consumers to purchases, subscriptions or other choices). The relevant member states’ consumer protection authorities will now contact the relevant traders to rectify their websites and take further action if necessary.
    • Enforcement can also be expected on a sectoral basis, such as in the financial sector, as evidenced by the statement of the German Federal Financial Supervisory Authority prohibiting dark patterns in trading apps and trading portals, published on November 21, 2022.

Conclusion

The EU is taking significant steps to further protect EU consumers’ rights, especially in the digital realm, and continues to provide recommendations to help companies meet these goals. With the adoption of the Digital Markets Act and Digital Services Act, and the negotiation of upcoming legislative proposals, the European institutions are setting the tone for 2023: more transparency and accountability in digital markets. The focus on regulating dark patterns will likely have far-reaching effects, as demonstrated by its nexus to a multitude of EU legislation. Additionally, because dark patterns regulation is not confined to any single instrument, enforcement against dark patterns is likely to increase. For example, dark patterns are also on the enforcement agenda of EU data protection authorities as they investigate the use of dark patterns in the processing of personal data and in digital marketing.

*          *          *

Covington will continue monitoring the regulation of dark patterns and the development of relevant EU legislation. We have particular expertise in digital marketing; if you want to learn more about these matters or have any questions, please do not hesitate to reach out to us.

Dan Cooper

Daniel Cooper is co-chair of Covington’s Data Privacy and Cyber Security Practice, and advises clients on information technology regulatory and policy issues, particularly data protection, consumer protection, AI, and data security matters. He has over 20 years of experience in the field, representing clients in regulatory proceedings before privacy authorities in Europe and counseling them on their global compliance and government affairs strategies. Dan regularly lectures on the topic, and was instrumental in drafting the privacy standards applied in professional sport.

According to Chambers UK, his “level of expertise is second to none, but it’s also equally paired with a keen understanding of our business and direction.” It was noted that “he is very good at calibrating and helping to gauge risk.”

Dan is qualified to practice law in the United States, the United Kingdom, Ireland and Belgium. He has also been appointed to the advisory and expert boards of privacy NGOs and agencies, such as the IAPP’s European Advisory Board, Privacy International and the European security agency, ENISA.

Sam Jungyun Choi

Recognized by Law.com International as a Rising Star (2023), Sam Jungyun Choi is an associate in the technology regulatory group in Brussels. She advises leading multinationals on European and UK data protection law and new regulations and policy relating to innovative technologies, such as AI, digital health, and autonomous vehicles.

Sam is an expert on the EU General Data Protection Regulation (GDPR) and the UK Data Protection Act, having advised on these laws since they started to apply. In recent years, her work has evolved to include advising companies on new data and digital laws in the EU, including the AI Act, Data Act and the Digital Services Act.

Sam’s practice includes advising leading companies in the technology, life sciences and gaming sectors on regulatory, compliance and policy issues relating to privacy and data protection, digital services and AI. She advises clients on the design of new products and services, preparing privacy documentation, and developing data and AI governance programs. She also advises clients on matters relating to children’s privacy and policy initiatives relating to online safety.

Diane Valat

Diane Valat is a trainee who attended IE University.

Anna Oberschelp de Meneses

Anna Sophia Oberschelp de Meneses is an associate in the Data Privacy and Cybersecurity Practice Group.

Anna is a qualified Portuguese lawyer, but is both a native Portuguese and German speaker.

Anna advises companies on European data protection law and helps clients coordinate international data protection law projects.

She has obtained a certificate as a “corporate data protection officer” from the German Association for Data Protection and Data Security (“Gesellschaft für Datenschutz und Datensicherheit e.V.”). She is also a Certified Information Privacy Professional Europe (CIPP/E) with the International Association of Privacy Professionals (IAPP).

Anna also advises companies in the field of EU consumer law and has been closely tracking the developments in this area.

Her extensive language skills allow her to monitor developments and help clients tackle EU Data Privacy, Cybersecurity and Consumer Law issues in various EU and ROW jurisdictions.