Automated Decision-Making and Dark Patterns: Where Do the GDPR and DSA Meet?



November 24, 2023


The digital age has brought about revolutionary changes in how we interact with technology, particularly in the realm of automated decision-making (ADM) and deceptive design patterns, commonly known as dark patterns. These technological strategies significantly influence user behavior and decision-making processes, often without the user's conscious awareness. The potential for ADM and dark patterns to infringe upon individual rights has thus prompted regulatory scrutiny. The GDPR, for instance, provides guidelines to ensure transparency and fairness and grants users the right to seek human intervention in automated decisions (Articles 13, 14, and 22 of the GDPR). The new Digital Services Act also acknowledges the necessity for regulations to prevent manipulative online interfaces.

This article delves into the parallels between ADM and dark patterns, highlighting the critical need for transparency, fairness, and regulatory oversight to protect user rights in the digital environment.


Automated Decision-Making (ADM): This term generally refers to the process of making decisions by technological means without human involvement. In the context of EU data protection law, ADM involves the use of personal data processed by algorithms or computer systems to make decisions that have legal effects on individuals or similarly significantly affect them. The General Data Protection Regulation (GDPR) addresses ADM primarily in Article 22, which provides individuals with rights related to automated individual decision-making, including profiling.

Dark Patterns (also referred to as Deceptive Design Patterns): These are interfaces and user journeys implemented on social media platforms and other digital services that aim to influence users into making unintended, unwilling, and/or potentially harmful decisions, often toward an option that is against the users’ best interests and in favor of the social media platforms' interest, with regard to their personal data (Guidelines 03/2022 on deceptive design patterns in social media platform interfaces, page 9). Dark patterns rely on cognitive biases and can hinder users' ability to effectively protect their personal data and make conscious choices.

GDPR, automated decision making and dark patterns

Automated decision-making, including profiling, is regulated under the General Data Protection Regulation (GDPR) primarily in Article 22. Dark patterns are not explicitly mentioned in the GDPR, but they relate to the principles of fairness, transparency, and the requirement for clear and informed consent, which are integral to the GDPR.

According to Article 22 of the GDPR, individuals have the right not to be subject to decisions based solely on automated processing, including profiling, which have legal effects or similarly significant consequences. However, there are exceptions where such processing is allowed: (1) it is necessary for entering into or performing a contract; (2) it is authorized by EU or Member State law; (3) it is based on the individual's explicit consent [Regulation (EU) 2016/679; Article 22(2)].

Recital 71 outlines that when automated decision-making is permitted, data controllers must implement appropriate safeguards. These include the right to obtain human intervention, to express one's point of view, and to contest the decision, so that individuals can assert the fairness of the decision-making process.
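By way of illustration only, the combined logic of Article 22(2) and the Recital 71 safeguards can be sketched as a toy decision gate. This is a minimal, hypothetical sketch, not a compliance mechanism; all names (`LawfulBasis`, `review_automated_decision`) are invented for the example:

```python
from enum import Enum, auto
from typing import Optional

class LawfulBasis(Enum):
    # The three Article 22(2) exceptions under which a solely
    # automated decision with significant effects may proceed.
    CONTRACT_NECESSITY = auto()   # Art. 22(2)(a)
    AUTHORISED_BY_LAW = auto()    # Art. 22(2)(b)
    EXPLICIT_CONSENT = auto()     # Art. 22(2)(c)

def review_automated_decision(basis: Optional[LawfulBasis],
                              human_review_requested: bool) -> str:
    """Toy gate for a solely automated decision with significant effects."""
    if basis is None:
        # Art. 22(1): without a valid exception, the decision is barred.
        return "blocked"
    if human_review_requested:
        # Recital 71 safeguard: the individual may obtain human
        # intervention and contest the decision.
        return "escalate_to_human"
    return "proceed"

assert review_automated_decision(None, False) == "blocked"
assert review_automated_decision(LawfulBasis.EXPLICIT_CONSENT, True) == "escalate_to_human"
```

The point of the sketch is structural: the automated path is the exception that must be affirmatively justified, and a request for human review always takes the decision out of the purely automated track.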

Articles 13(2)(f), 14(2)(g), and 15(1)(h) lay out the GDPR's transparency rules and oblige data controllers to inform individuals about the existence of automated decision-making, including profiling. They must provide meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing.

While the GDPR does not explicitly mention "dark patterns," its principles of transparency, fairness, and the requirement for clear and informed consent are antithetical to dark patterns. The regulation requires that consent be given through a clear affirmative action and that information about data processing be easily accessible and understandable [Regulation (EU) 2016/679; Articles 4(11), 7, and 12]. Moreover, the GDPR promotes data protection by design and by default under Article 25. This principle requires data controllers to implement measures that ensure compliance with the regulation and that privacy settings are set at a high level by default [Regulation (EU) 2016/679; Article 25]. This approach indirectly combats dark patterns by mandating that data processing practices are designed with the user's privacy in mind from the outset.
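To make the "data protection by default" idea concrete, the principle can be sketched as a settings object in which every optional processing purpose starts disabled, so that user inaction never amounts to consent. This is a hypothetical illustration only; the class and purpose names are invented for the example:

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    # "By default" (Art. 25): every optional purpose is off unless the
    # user takes a clear affirmative action to enable it.
    analytics: bool = False
    personalised_ads: bool = False
    third_party_sharing: bool = False

    def opt_in(self, purpose: str) -> None:
        # Consent requires an explicit action (Arts. 4(11) and 7):
        # only a deliberate call flips a flag to True.
        if not hasattr(self, purpose):
            raise ValueError(f"unknown purpose: {purpose}")
        setattr(self, purpose, True)

settings = PrivacySettings()
assert not settings.personalised_ads  # protective default, no pre-ticked boxes
settings.opt_in("analytics")          # explicit user choice
assert settings.analytics
```

The contrast with a dark pattern is the default value: a pre-ticked consent box would correspond to initializing these flags to `True`, which the GDPR's consent and by-default requirements rule out.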

DSA and dark patterns use by online platforms

The Digital Services Act (DSA) addresses the issue of dark patterns through its provisions aimed at ensuring a transparent and safe online environment. Specifically, the DSA includes measures to prohibit online platforms from using deceptive design patterns that manipulate users into making decisions that are not in their best interests, particularly in relation to the processing of their personal data. The relevant provisions are as follows:

Article 25 (1) of the DSA prohibits providers of online platforms from designing, organizing, or operating their online interfaces in a way that deceives or manipulates the recipients of their service or in a way that otherwise materially distorts or impairs the ability of the recipients of their service to make free and informed decisions.

Recital 57 of the DSA explains that to avoid disproportionate burdens, the additional obligations imposed under the Regulation on providers of online platforms should not apply to providers that qualify as micro or small enterprises. However, considering that very large online platforms or very large online search engines have a larger reach and a greater impact, such providers should not benefit from that exclusion, irrespective of whether they qualify or recently qualified as micro or small enterprises.

Recital 67 of the DSA specifically addresses dark patterns by stating that dark patterns on online interfaces of online platforms are practices that materially distort or impair the ability of recipients of the service to make autonomous and informed choices or decisions. The DSA prohibits providers of online platforms from deceiving or nudging recipients of the service and from distorting or impairing their autonomy, decision-making, or choice via the structure, design, or functionalities of an online interface.

Impact of ADM and dark patterns on data subjects

Influence on User Decisions:

Automated decision-making systems, on the one hand, can influence decisions by processing large amounts of data and applying algorithms to generate outcomes without human intervention [Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679]. Dark patterns, on the other hand, are design strategies used in online interfaces that manipulate users into making decisions that may not be in their best interest, often favoring the interests of the service provider [Digital Services Act, recital 67].

Lack of Transparency:

A significant concern with both ADM and dark patterns is the opacity surrounding their operation. Users are often left in the dark regarding how decisions are made or why certain design choices are presented to them, leading to a lack of informed consent and difficulty in holding service providers accountable. The "black box" nature of some AI systems used in automated decision-making can obscure the reasoning behind decisions, similar to how dark patterns obscure the true intentions behind certain design choices [EDPS Opinion on the European Commission’s White Paper on Artificial Intelligence – A European approach to excellence and trust, paragraph 48].

Potential for Unfair Outcomes:

Both ADM and dark patterns carry the risk of leading to biased or discriminatory outcomes. In the case of ADM, if the underlying data or algorithms are flawed, the decisions made can be unfair. Similarly, dark patterns can coerce users into unfavorable actions, such as sharing personal data without clear intent or making purchases by mistake [Digital Services Act, recital 67]. There is, therefore, a recognition of the importance of human oversight in both automated decision-making and in the design of online interfaces to prevent negative impacts on users. The EDPS specifically highlights the inclusion of human judgment and the ability to contest automated decisions or manipulative design practices as essential for safeguarding individual rights [EDPS Opinion on the European Commission’s White Paper on Artificial Intelligence – A European approach to excellence and trust, paragraphs 47 and 48].

In summary, both automated decision-making and dark patterns involve the use of technology to influence user decisions, often without sufficient transparency or consideration of the user's best interests. They both raise concerns about potential biases, the need for regulatory oversight, and the importance of human intervention to ensure fairness and respect for individual rights.

This article has been written with the help of CuratedAI - The AI Assistant for European Lawyers.