Earlier this year, the UK’s privacy and competition regulators (the ICO and CMA) issued a joint paper setting out their concerns and expectations in the field of dark patterns – techniques designed to mislead or deceive users of online services – which the regulators refer to as “harmful online choice architectures”. As we’ve previously noted, dark patterns are an area of increasing focus for regulators, and the joint paper reflects the growing interplay between privacy and competition laws – a trend we expect to see continue in 2024.

Privacy and competition issues go together

The regulators’ decision to issue a joint paper reflects the fact that dark patterns raise both privacy and competition law concerns. For example, a dark pattern that obscures the fact that a user’s data will be shared across a company’s different products may both prevent the user from understanding how their data is being used (potentially in breach of privacy laws) and give that company an unfair competitive advantage over companies that do not have access to the same data (potentially in breach of competition laws). This interplay between privacy and competition law isn’t new – the UK’s privacy and competition regulators have been cooperating through their Digital Regulation Cooperation Forum since 2020 – but the joint paper serves as a timely reminder that these issues require cross-disciplinary expertise and should be considered together.

Specific dark patterns raised by the regulators

The joint paper gives five examples of dark patterns that are of concern to both regulators:

  • nudges and sludges are techniques that encourage users to take actions that are potentially harmful to them, or that make it harder for them to take the actions they actually want – such as providing an easy “accept all” button to opt in to data collection while requiring users to take multiple steps to opt out;
  • confirmshaming is a technique that presents certain activities, such as withholding consent to data collection, as undesirable – the joint paper gives the example of a shopping website that invites users to sign up to a mailing list for a discount, presenting the choice with emotive language such as “yes, I want to save!” or “nah, I don’t like discounts!”, instead of more neutral wording such as “subscribe to receive emails” or “do not subscribe to receive emails”;
  • biased framing is a technique that presents certain options in a more favourable light than others – for example, website text that emphasises the benefits of opting in to data collection without giving equal weight to the potential harms of doing so;
  • bundled consent is a technique that involves using a single consent to cover multiple activities, instead of requesting consent for each one; and
  • default settings involve making one option the standard or default option, while requiring active intervention to switch to another option – for example, by pre-ticking certain boxes.

In the joint paper, the ICO emphasises that these practices undermine user choice and therefore risk breaching the UK GDPR principles of fairness, transparency, and data protection by design and by default, as well as rendering invalid any user consent obtained through dark patterns. The CMA, in turn, emphasises that these practices may allow large companies to collect more data than users intend to provide, to preference their own services over their competitors’, and to lock consumers into their ecosystem of services, thereby strengthening their market power and unfairly inhibiting market entry by smaller competitors.

Dark patterns facing increased scrutiny

As we’ve previously written, dark patterns are increasingly an area of focus for regulators in the UK and across Europe.

In the UK, the ICO’s Age Appropriate Design Code includes explicit obligations in relation to “nudge” techniques, while the CMA has issued a paper on the competition issues associated with dark patterns and identified misleading consumer practices resulting from them as an enforcement priority in its 2023-24 annual plan.

This focus will further intensify with the Digital Markets, Competition and Consumers Bill (currently making its way through Parliament), which would empower the CMA with new enforcement tools to tackle competition and consumer issues arising from dark patterns. Instead of relying on traditional competition law doctrines, the CMA would be able to impose (a) bespoke conduct requirements on designated firms, such as an obligation to design options in a way that allows users to make informed and effective decisions; and (b) structural or behavioural pro-competition interventions to tackle deeper or more widespread issues, such as imposing on designated firms a duty of “fairness by design” when presenting choices to users about their personal data.

Meanwhile, in the European Union, the EU Consumer Protection Cooperation Network recently conducted a review of 400 retail websites to identify dark patterns (see our blog here), the European Data Protection Board issued guidelines on dark patterns (see our blog here), and the Italian Garante fined a digital marketing company for its use of dark patterns (see our blog here). Dark patterns also feature in the text of new European legislation, including the Digital Services Act and the Data Act.

In light of this increasing regulatory scrutiny, companies should consider reviewing their existing digital services for the presence of dark patterns (or interfaces that regulators may perceive as being dark patterns) and assess the privacy, competition, and consumer law challenges those dark patterns present.


Covington frequently advises on the regulation of dark patterns, including on privacy and competition issues, and we have particular expertise in the field of digital marketing. If you want to learn more about these matters or have any questions, please do not hesitate to reach out to us.

Dan Cooper

Daniel Cooper is co-chair of Covington’s Data Privacy and Cyber Security Practice, and advises clients on information technology regulatory and policy issues, particularly data protection, consumer protection, AI, and data security matters. He has over 20 years of experience in the field, representing clients in regulatory proceedings before privacy authorities in Europe and counseling them on their global compliance and government affairs strategies. Dan regularly lectures on the topic, and was instrumental in drafting the privacy standards applied in professional sport.

According to Chambers UK, his “level of expertise is second to none, but it’s also equally paired with a keen understanding of our business and direction.” It was noted that “he is very good at calibrating and helping to gauge risk.”

Dan is qualified to practice law in the United States, the United Kingdom, Ireland and Belgium. He has also been appointed to the advisory and expert boards of privacy NGOs and agencies, such as the IAPP’s European Advisory Board, Privacy International and the European security agency, ENISA.

Vicky Ling

Vicky Ling is an associate in Covington’s competition team. She advises on all aspects of EU and UK competition law, including merger control, abuse of dominance, antitrust litigation, regulatory investigations and enforcement.