
Dark patterns and ethical design in data privacy


Design choices made by websites and applications influence user choice in data privacy. They can foster autonomy, providing users with relevant information in a structured way that allows them to act on it easily. Or they can do the exact opposite. Some of the particularly egregious offenders are now called “dark patterns” and were for the first time expressly outlawed in the EU with the recent adoption of the Digital Services Act. The examples below show websites leading by counterexample, along with tips on how to foster user autonomy and trust through better-designed choice architectures.


What are dark patterns?


In essence, dark patterns are tricks in the digital realm that make consumers do things that they didn’t mean to do. Websites and apps can employ deceptive design, which exploits cognitive biases and limits consumer autonomy, to manipulate users into making choices that they otherwise wouldn’t have made. In a consumer context, they make people buy things they otherwise would not have bought; in data privacy, they make users disclose more data than they otherwise would have disclosed.

 

The EU study on unfair commercial practices talks about choices that are not in the best interests of the consumer; a better framing, however, is the one found in the Digital Services Act: “practices that materially distort or impair, either on purpose or in effect, the ability of recipients of the service to make autonomous and informed choices or decisions.” At the heart of the issue lies not that a choice was a bad one, but that there was no genuine choice to speak of.


Examples


Dark patterns can be found both outside and within the area of data privacy, and they are everywhere — 97% of the most popular websites and apps used by EU consumers deployed at least one dark pattern. Some common examples include:


"Confirmshaming"

the entertainingly named “confirmshaming” makes users feel bad about choosing not to engage with the offered feature;



“Hard to cancel”

makes it disproportionately difficult to unsubscribe or resign from a service;



“Preselection”

shows a pre-ticked checkbox or otherwise exploits the default bias;



“Trick wording”

uses double negatives or combines consent to two separate items into a single sentence.



Every one of these designs makes the more data-intensive option the easier path and makes rejecting notifications, cookies, subscriptions, or emails disproportionately burdensome. Most of the examples above are even more egregious 2-in-1 dark pattern combinations: in addition to the specific pattern they were meant to illustrate, they also employ “visual interference” and “asymmetric choice,” obscuring the rejection buttons and favouring the “allow”/“OK” option.
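
To make this concrete, below is a minimal sketch in TypeScript (using plain DOM APIs; the wording, element names and styles are purely illustrative assumptions, not taken from any real site) of how “preselection” and “asymmetric choice” typically appear together in a cookie banner.

// Illustrative only: a cookie banner combining "preselection" and "asymmetric choice".
function darkPatternCookieBanner(): HTMLDivElement {
  const banner = document.createElement("div");

  // Preselection: the optional analytics checkbox is already ticked,
  // exploiting the default bias described above.
  const analytics = document.createElement("input");
  analytics.type = "checkbox";
  analytics.checked = true; // the user has to act to opt out
  banner.appendChild(analytics);
  banner.append(" Allow analytics cookies");

  // Asymmetric choice: a large, brightly coloured "OK" button...
  const accept = document.createElement("button");
  accept.textContent = "OK";
  accept.style.cssText = "font-size:1.3em; padding:12px 40px; background:#1a8917; color:#fff;";
  banner.appendChild(accept);

  // ...while the only way to refuse is a small, low-contrast link that
  // leads to a second layer of settings ("visual interference").
  const manage = document.createElement("a");
  manage.textContent = "manage preferences";
  manage.href = "#settings";
  manage.style.cssText = "font-size:0.7em; color:#999;";
  banner.appendChild(manage);

  return banner;
}

Accepting takes a single click on the prominent button, while refusing requires spotting the faint link and working through a second layer of settings.
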


The law


The use of dark patterns in data privacy has recently been expressly banned for the first time by the DSA (Digital Services Act) in the European Union, and by the amendments to the CCPA (California Consumer Privacy Act) introduced through the CPRA in the USA. Most of these deceptive designs were, however, already unlawful under the existing data protection regimes in those territories.


The GDPR provides certain general principles and guidelines for how personal data ought to be processed. It should be done transparently in relation to the data subject, fairly (in a way that does not exploit the power imbalance between platform and customer), for a specific purpose, and only to the extent necessary to meet that purpose. The entity in control of the data should also implement design and defaults that protect personal data.

 

These broad strokes can be somewhat difficult to apply in practice, and if this were the extent of the Regulation, it would be understandable that dark patterns are still so pervasive in the digital sphere. However, the law also offers more specific guidelines on consent and choice.

 

Under the GDPR (General Data Protection Regulation), consent is an indication of the data subject’s wishes by a clear affirmative action, and must be freely given, specific, informed and unambiguous. For personal data to be processed on the basis of consent, consent must be asked for “in a manner clearly distinguishable from other matters, in an intelligible and easily accessible form, using clear and plain language.” It must also be “as easy to withdraw as to give consent.” The requirements for consent under the ePrivacy Directive are fundamentally the same.

 

Looking at the examples of dark patterns above, we can see how these designs are at odds with the law.


“Confirmshaming”

interferes with consent being “freely given.” Recital 42(5) notes that consent ought not to be considered freely given if “the data subject has no genuine or free choice or is unable to refuse or withdraw consent without detriment.”

 

“Hard to cancel”

makes it harder to withdraw consent than it was to give it, and both “preselection” and “asymmetric choice” make it harder to decline than to agree.

 

“Preselection”

is additionally at odds with data protection by design and by default, and the principle of data minimisation.

 

“Trick wording”

is inconsistent with the requirements of intelligibility, accessibility, and clear and plain language. In the example above, it also fails to obtain granular consent for each category of data collected. 


Deceptive design in data privacy has been unlawful at least since the GDPR came into force in May 2018, and for longer in the context of commercial practices and cookies, since the ePrivacy Directive’s cookie consent rules came into effect in 2011. Nevertheless, dark patterns are still ubiquitous on the Internet in 2024.


What will the Digital Services Act change, then?


It will require large platforms to publish their active user numbers from the 17th of February 2024. It will impose obligations on very large platforms and search engines to perform annual risk assessments. It will not affect the lawfulness of dark patterns in data privacy. Those have been and will remain unlawful. It may, however, affect enforcement.

 

The DSA will impose prohibitions against manipulative and deceptive design on providers of online platforms (but not on all intermediary services). It will also give the European Commission the right to issue guidance on the application of those prohibitions to three specific types of dark patterns: “asymmetric choice”; “nagging” (“repeatedly requesting a recipient of the service to make a choice where such a choice has already been made, especially by presenting a pop-up that interferes with user experience”); and “hard to unsubscribe.”

 

These provisions will extend to dark patterns in data privacy as well, so the impact of the DSA will depend on its practical enforcement and therefore remains to be seen. If it is enforced better than the GDPR, it might have a notable impact on the actual pervasiveness of dark patterns in data privacy. 

 

Best practices


The European Data Protection Board’s guidance on cookie banners offers some useful, generalisable tips for choice architecture design in the wider data privacy context. Many of them will sound familiar – they stem directly from the legal provisions we discussed in the previous section. The gist of the advice is: make refusing/withdrawing consent as easy as giving it.

Practical suggestions (a short code sketch illustrating them follows the list):

1. Require an affirmative act for agreement, and treat omission as rejection.

That means not using pre-ticked boxes and adopting privacy-protective defaults.


2. Inform users of their right to refuse consent to data processing and to all but strictly necessary cookies.

Remind them that if they refuse consent to their data being processed, they are still entitled to access the website or service, unless that is legally or technologically impossible. For example, if they refuse access to their location data, it will not be possible to show them nearby cafes. It will, however, still be possible to show them cafes close to a specific address.

 

3. Include a “reject” button within the same layer as the “accept” button, ideally in the first layer of a banner.

A good rule of thumb is that refusing consent should not require more clicks than granting it.

 

4. Keep the different option buttons a similar size and colour.

Do not highlight the “accept all” button over the other available options, but don’t highlight the “reject all” button either. Without one option appearing more visually attractive, the user is more likely to consider both and make an informed decision.

 

5. Have a small, hovering icon permanently visible on the screen, through which users can access and change their privacy preferences at any time.

Consent is not a one-off; it’s a continuous state.
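
Putting the five tips together, below is a minimal sketch in TypeScript (plain DOM APIs; all labels, names and styles are illustrative assumptions rather than a reference implementation) of a consent banner that requires an affirmative act, informs users they can refuse, keeps “Accept all” and “Reject all” equally prominent in the first layer, and adds a persistent icon for changing preferences later.

// Illustrative sketch of a consent banner following the tips above.
type ConsentDecision = "accepted" | "rejected";

function consentBanner(onDecision: (d: ConsentDecision) => void): HTMLDivElement {
  const banner = document.createElement("div");

  // Tip 1: privacy-protective default, nothing pre-ticked.
  const analytics = document.createElement("input");
  analytics.type = "checkbox";
  analytics.checked = false;
  banner.appendChild(analytics);
  banner.append(" Allow analytics cookies (optional)");

  // Tip 2: the user is told they can refuse and still use the site.
  const notice = document.createElement("p");
  notice.textContent = "You can refuse all non-essential cookies and still use this site.";
  banner.appendChild(notice);

  // Tips 3 & 4: "Accept all" and "Reject all" sit in the same (first) layer,
  // with identical size, colour and prominence.
  const buttonStyle = "font-size:1em; padding:10px 24px; background:#eee; color:#000;";
  for (const [label, decision] of [["Accept all", "accepted"], ["Reject all", "rejected"]] as const) {
    const button = document.createElement("button");
    button.textContent = label;
    button.style.cssText = buttonStyle;
    button.addEventListener("click", () => onDecision(decision));
    banner.appendChild(button);
  }

  // Tip 1 (continued): dismissing the banner without an affirmative act
  // is treated as a rejection, never as consent.
  const close = document.createElement("button");
  close.textContent = "✕";
  close.setAttribute("aria-label", "Close");
  close.addEventListener("click", () => onDecision("rejected"));
  banner.appendChild(close);

  return banner;
}

// Tip 5: a small, always-visible icon that reopens the preferences at any time,
// reflecting that consent can be withdrawn as easily as it was given.
function preferencesIcon(reopen: () => void): HTMLButtonElement {
  const icon = document.createElement("button");
  icon.textContent = "🍪";
  icon.setAttribute("aria-label", "Privacy preferences");
  icon.style.cssText = "position:fixed; bottom:16px; left:16px;";
  icon.addEventListener("click", reopen);
  return icon;
}

Treating dismissal (or no decision at all) as a rejection, rather than re-prompting on every page load, keeps the default privacy-protective and avoids the “nagging” pattern the DSA singles out.
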


These tips for designing choice architectures within data privacy will not only help ensure compliance, but also promote data subject autonomy & informed consent, and foster trust between digital service providers and their customers.


Key Takeaways


The choices you make when designing a consent architecture – how the boxes look, how the options are phrased, how many clicks it takes to confirm versus reject – all affect the choices the user makes. They are consequential: they can foster informed choice & user autonomy, or they can undermine them.

 

Choice architectures that erode user autonomy & push users to provide consent without meaning to are called “dark patterns,” and they are unlawful under the current data protection regimes in the EU and the US. Consent obtained through a dark pattern is not freely given, and often fails to meet multiple other legal requirements for validity as well. Processing data on the basis of “consent” obtained in this way is unlawful; it opens the data controller to liability and reputational damage.

 

When designing a consent architecture, it is always worth asking: is it as easy to refuse as it is to agree? If the answer is yes, you’re on the right path. Refer to our best practices above for more tips.

