Automated Decision-Making and Profiling (Article 22)
Automated Decision-Making and Profiling under Article 22 of the General Data Protection Regulation (GDPR) is a critical provision designed to protect individuals from decisions made solely by automated processes that significantly affect them. This article grants data subjects the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects or similarly significantly affects them. Profiling is defined as any form of automated processing of personal data that evaluates personal aspects relating to a natural person, particularly to analyse or predict aspects concerning work performance, economic situation, health, personal preferences, interests, reliability, behaviour, location, or movements. There are three exceptions where automated decision-making is permitted: (1) when it is necessary for entering into or performing a contract between the data subject and the data controller; (2) when it is authorised by EU or Member State law with suitable safeguards; or (3) when it is based on the data subject's explicit consent. When automated decisions are made under exceptions (1) or (3), the data controller must implement suitable measures to safeguard the data subject's rights, freedoms, and legitimate interests. At minimum, the data subject has the right to obtain human intervention, express their point of view, and contest the decision.
Additionally, automated decisions should not be based on special categories of personal data (such as race, health, or biometric data) unless the data subject has given explicit consent or processing is necessary for reasons of substantial public interest, with appropriate safeguards in place. Data controllers must also provide meaningful information about the logic involved in automated decision-making, as well as the significance and envisaged consequences of such processing for the data subject, under Articles 13 and 14 (transparency obligations). Data Protection Impact Assessments (DPIAs) are typically required for systematic profiling activities. This provision reflects the GDPR's commitment to ensuring human oversight over significant algorithmic decisions affecting individuals' lives.
Automated Decision-Making and Profiling (Article 22) – A Comprehensive Guide
Introduction
Automated decision-making and profiling under Article 22 of the General Data Protection Regulation (GDPR) is one of the most frequently tested and nuanced topics in the CIPP/E certification exam. Understanding this provision is essential not only for passing the exam but also for advising organizations on lawful data processing practices. This guide provides a thorough explanation of what automated decision-making and profiling are, why they matter, how they work in practice, and how to approach exam questions on this topic.
Why Is This Topic Important?
Automated decision-making and profiling have become pervasive in modern business operations. From credit scoring and insurance underwriting to targeted advertising and recruitment screening, organizations increasingly rely on algorithms and automated systems to make decisions about individuals. These practices raise significant concerns about:
• Fundamental rights and freedoms: Decisions made without meaningful human involvement can adversely affect individuals' access to services, employment, credit, and other opportunities.
• Transparency and fairness: Individuals may not understand how decisions about them are being made, leading to a lack of accountability.
• Discrimination: Algorithms can perpetuate or amplify biases present in training data, leading to discriminatory outcomes.
• Data protection principles: Automated decision-making must comply with core GDPR principles such as lawfulness, fairness, transparency, data minimisation, and purpose limitation.
Article 22 is the GDPR's primary safeguard against the risks posed by solely automated decision-making, including profiling. For the CIPP/E exam, this topic intersects with several other areas including data subject rights, lawful bases for processing, Data Protection Impact Assessments (DPIAs), and the use of special categories of data.
What Is Automated Decision-Making?
Automated decision-making refers to the process of making a decision by automated means without any human involvement. This includes decisions made by algorithms, artificial intelligence systems, or other technological tools that process personal data and produce a decision without a human reviewing or contributing to the outcome.
Key characteristics include:
• The decision is made solely by automated means.
• There is no meaningful human intervention in the decision-making process.
• The decision produces legal effects or similarly significantly affects the data subject.
What Is Profiling?
Article 4(4) of the GDPR defines profiling as any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person's:
• Performance at work
• Economic situation
• Health
• Personal preferences
• Interests
• Reliability
• Behaviour
• Location
• Movements
It is critical to understand that profiling and automated decision-making are not the same thing, though they often overlap. Profiling is a form of processing; automated decision-making is a type of decision. Profiling can occur without automated decision-making (e.g., when a human reviews the profile before making a decision), and automated decision-making can occur without profiling (e.g., a speed camera automatically issuing a fine).
What Does Article 22 Actually Say?
Article 22(1) provides that: "The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her."
This establishes a general prohibition (or, as debated by scholars, a right) against solely automated decision-making that has legal or similarly significant effects. The Article 29 Working Party (now the European Data Protection Board, or EDPB) has interpreted this as a general prohibition rather than merely a right that must be invoked by the data subject.
Three Conditions That Must Be Met for Article 22 to Apply:
1. The decision is based solely on automated processing – There is no meaningful human involvement. Token human involvement (rubber-stamping) does not count. The human must have the authority and competence to change the decision.
2. Including profiling – The automated processing may or may not involve profiling. The phrase "including profiling" clarifies that profiling-based decisions are covered but does not limit the scope to profiling alone.
3. The decision produces legal effects or similarly significantly affects the data subject – Examples of legal effects include cancellation of a contract, denial of a social benefit, or refusal of citizenship. "Similarly significant effects" include decisions that affect access to health services, denial of employment, or refusal of a loan. The EDPB guidelines state that the effect must be sufficiently great or significant to warrant attention, such as affecting someone's financial circumstances, access to essential services, or employment opportunities.
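The three-condition test above can be sketched as a small Python check. This is purely illustrative: the names (`Decision`, `article_22_applies`) are invented for this example and do not come from any real compliance library.

```python
from dataclasses import dataclass

@dataclass
class Decision:
    solely_automated: bool       # no meaningful human involvement (condition 1)
    legal_effect: bool           # e.g. contract cancellation, benefit denial
    similarly_significant: bool  # e.g. loan refusal, denial of employment

def article_22_applies(d: Decision) -> bool:
    """Article 22(1) is triggered only when the decision is solely
    automated AND produces a legal or similarly significant effect.
    Profiling may or may not be involved; it is not a separate trigger."""
    return d.solely_automated and (d.legal_effect or d.similarly_significant)

# A decision with meaningful human review falls outside Article 22,
# however significant its effects:
print(article_22_applies(Decision(False, True, False)))  # False
# A solely automated decision with a similarly significant effect is caught:
print(article_22_applies(Decision(True, False, True)))   # True
```

Note that "including profiling" does not appear as a condition in the check, mirroring the point in condition 2: profiling widens the illustration of what is covered but does not narrow the scope.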
Exceptions to the Prohibition (Article 22(2))
The prohibition does not apply if the automated decision:
1. Is necessary for entering into, or performance of, a contract between the data subject and a data controller. Example: An automated credit decision that is necessary to process a loan application.
2. Is authorised by EU or Member State law to which the controller is subject, and which lays down suitable measures to safeguard the data subject's rights, freedoms, and legitimate interests. Example: National tax legislation that permits automated fraud detection.
3. Is based on the data subject's explicit consent. The consent must meet the high GDPR standard of being freely given, specific, informed, and unambiguous, and must be explicit (a higher bar than ordinary consent).
Safeguards (Article 22(3))
Where one of the exceptions in Article 22(2)(a) or (c) applies (contract necessity or explicit consent), the data controller must implement suitable measures to safeguard the data subject's rights, freedoms, and legitimate interests. At a minimum, these must include the right to:
• Obtain human intervention on the part of the controller
• Express his or her point of view
• Contest the decision
These safeguards are mandatory and cannot be waived by the data subject.
Special Categories of Data (Article 22(4))
Automated decisions, including those based on profiling, must not be based on special categories of personal data (as defined in Article 9(1)) unless:
• The data subject has given explicit consent (Article 9(2)(a)), OR
• Processing is necessary for reasons of substantial public interest (Article 9(2)(g))
AND suitable measures to safeguard the data subject's rights, freedoms, and legitimate interests are in place.
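The combined effect of the Article 22(2) exceptions and the Article 22(4) restriction can be sketched as follows. This is a simplified, hypothetical model (all parameter names are invented); in particular, it treats suitable safeguards as a single flag, whereas under Article 22(2)(b) the authorising law itself must lay them down.

```python
def automated_decision_permitted(
    contract_necessity: bool,          # Article 22(2)(a)
    authorised_by_law: bool,           # Article 22(2)(b)
    explicit_consent: bool,            # Article 22(2)(c)
    uses_special_category_data: bool,  # Article 9(1) data involved?
    substantial_public_interest: bool, # Article 9(2)(g)
    suitable_safeguards: bool,         # measures protecting rights/freedoms
) -> bool:
    # Article 22(2): at least one exception must apply.
    if not (contract_necessity or authorised_by_law or explicit_consent):
        return False
    # Article 22(4): special category data additionally requires explicit
    # consent or substantial public interest.
    if uses_special_category_data:
        if not (explicit_consent or substantial_public_interest):
            return False
    # Suitable safeguards must be in place in every case.
    return suitable_safeguards
```

For example, contract necessity alone does not legitimise a decision based on health data: `automated_decision_permitted(True, False, False, True, False, True)` returns `False`, because neither explicit consent nor substantial public interest is present.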
Transparency and Information Obligations
Articles 13(2)(f) and 14(2)(g) require controllers to inform data subjects about:
• The existence of automated decision-making, including profiling
• Meaningful information about the logic involved
• The significance and envisaged consequences of such processing for the data subject
This information must be provided at the time of data collection (Article 13) or within a reasonable period when data is not obtained directly from the data subject (Article 14). This transparency obligation applies even where the processing falls within one of the Article 22(2) exceptions.
Additionally, Article 15(1)(h) gives data subjects the right to obtain this same information through a subject access request.
The Role of Data Protection Impact Assessments (DPIAs)
Article 35(3)(a) explicitly states that a DPIA is required for "a systematic and extensive evaluation of personal aspects relating to natural persons which is based on automated processing, including profiling, and on which decisions are based that produce legal effects concerning the natural person or similarly significantly affect the natural person."
This means that most instances of automated decision-making that fall under Article 22 will require a DPIA. This is a frequently tested connection in the CIPP/E exam.
How It Works in Practice
Consider a practical example: An online lender uses an algorithm to assess loan applications. The algorithm evaluates the applicant's credit history, income, spending patterns, and social media activity to produce a credit score. Applications scoring below a threshold are automatically rejected without any human review.
In this scenario:
• The decision (loan rejection) is solely automated.
• It involves profiling (evaluating economic situation, reliability, behaviour).
• It produces a legal effect (denial of a financial product) or at least similarly significantly affects the applicant.
• Article 22 applies, and the lender must either cease the practice or rely on one of the exceptions.
• If the lender relies on the contract necessity exception, it must still provide the three minimum safeguards (human intervention, right to express a view, right to contest).
• The lender must inform applicants about the automated decision-making, provide meaningful information about the logic, and explain the consequences.
• A DPIA must be conducted.
• If special category data is used (e.g., health data), explicit consent or substantial public interest grounds are required.
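The compliance checklist that the loan scenario walks through can be summarised in a short sketch. The function and its obligation strings are illustrative only, intended as an exam-revision aid rather than legal advice.

```python
def controller_obligations(relies_on_contract_or_consent: bool,
                           uses_special_category_data: bool) -> list[str]:
    """Obligations triggered once Article 22 applies to a decision,
    per the loan-scenario analysis above (illustrative sketch)."""
    obligations = [
        "Inform applicants of the automated decision-making (Arts 13/14)",
        "Provide meaningful information about the logic involved",
        "Explain the significance and envisaged consequences",
        "Conduct a DPIA (Art 35(3)(a))",
    ]
    # Article 22(3) safeguards attach to the contract-necessity and
    # explicit-consent exceptions:
    if relies_on_contract_or_consent:
        obligations += [
            "Offer human intervention",
            "Allow the applicant to express their point of view",
            "Allow the applicant to contest the decision",
        ]
    # Article 22(4) adds conditions where special category data is used:
    if uses_special_category_data:
        obligations.append(
            "Ensure explicit consent or substantial public interest (Art 22(4))"
        )
    return obligations
```

Calling `controller_obligations(True, True)` for a lender relying on contract necessity and processing health data yields all eight obligations listed in the scenario.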
Key Distinctions to Remember
• Profiling alone ≠ Article 22: Profiling that does not lead to a solely automated decision with legal or similarly significant effects is not caught by Article 22 (though it must still comply with other GDPR provisions).
• Human involvement must be meaningful: Having a human nominally in the loop who simply rubber-stamps the automated output does not take the processing outside Article 22. The human must have genuine authority, competence, and the ability to override the automated decision.
• "Similarly significantly affects": The EDPB has indicated this includes decisions affecting access to health services, employment opportunities, or educational opportunities. Targeted advertising generally does not meet this threshold, though it could in some circumstances (e.g., if it exploits vulnerabilities).
• Article 22 vs. Article 21 (Right to Object): Article 21 provides a right to object to profiling for direct marketing purposes (absolute right) and profiling based on legitimate interests or public interest (qualified right). Article 22 addresses solely automated decisions. These are distinct rights, though they can overlap.
EDPB/WP29 Guidelines
The Article 29 Working Party issued Guidelines on Automated Individual Decision-Making and Profiling (WP251rev.01), which were endorsed by the EDPB. Key points from these guidelines that are exam-relevant include:
• Article 22(1) is interpreted as a prohibition, not merely a right that must be actively invoked.
• "Meaningful information about the logic involved" does not require disclosure of the full algorithm or source code, but should be sufficient for the data subject to understand the rationale and challenge the decision.
• Controllers should use mathematical or statistical procedures that are appropriate, implement technical and organisational measures to correct inaccuracies, and minimise the risk of errors.
• Controllers should conduct regular checks to ensure that systems are working as intended and are not producing discriminatory effects.
Exam Tips: Answering Questions on Automated Decision-Making and Profiling (Article 22)
1. Read the question carefully for the three triggers: Always check whether the scenario involves (a) a solely automated decision, (b) including or not including profiling, and (c) legal or similarly significant effects. If any of these elements is missing, Article 22 may not apply.
2. Distinguish between profiling and automated decision-making: The exam frequently tests whether candidates understand that these are different concepts. Profiling is a type of processing; automated decision-making is a type of decision. They can exist independently or together.
3. Know the three exceptions cold: Contract necessity, authorisation by law, and explicit consent. Be prepared to identify which exception applies in a given scenario and what safeguards must accompany it.
4. Remember the mandatory safeguards: Human intervention, right to express a point of view, and right to contest the decision. These apply when relying on the contract necessity or explicit consent exceptions.
5. Link to transparency obligations: Questions may test whether you know that data subjects must be informed about the existence of automated decision-making, the logic involved, and the significance and consequences. Remember Articles 13(2)(f), 14(2)(g), and 15(1)(h).
6. Connect to DPIAs: If the question involves systematic profiling leading to decisions with legal or significant effects, a DPIA is required under Article 35(3)(a). This is a common cross-topic question.
7. Watch for special category data: If the scenario involves health data, biometric data, racial or ethnic origin, or other Article 9 data, remember that only explicit consent or substantial public interest can justify automated decision-making with such data.
8. Assess whether human involvement is meaningful: If a scenario describes a human who simply confirms an automated output without exercising independent judgment, the decision is still "solely automated" for Article 22 purposes.
9. Don't confuse the right to object (Article 21) with Article 22: Article 21 covers objecting to processing (including profiling) based on certain grounds. Article 22 covers the right not to be subject to solely automated decisions. Know when each applies.
10. Consider the "similarly significantly affects" threshold: Not every automated decision triggers Article 22. The effect must be comparable to a legal effect in terms of its impact on the individual. Routine personalisation of content or non-consequential recommendations typically do not meet this threshold.
11. Use the process of elimination: For multiple-choice questions, eliminate answers that confuse profiling with automated decision-making, that omit mandatory safeguards, or that apply the wrong lawful basis or exception.
12. Remember the EDPB interpretation: Article 22(1) is a prohibition, not merely a right. This means controllers cannot engage in solely automated decision-making with legal/significant effects unless an exception applies – data subjects do not need to actively invoke this right.
13. Practice scenario-based analysis: Many CIPP/E questions present a factual scenario and ask you to identify the correct legal analysis. Practice breaking down scenarios into their component parts: What data is being processed? Is there profiling? Is the decision solely automated? What are the effects? Which exception applies? What safeguards are needed?
Summary
Article 22 of the GDPR provides a critical protection for individuals against the risks of solely automated decision-making, including profiling, where such decisions produce legal or similarly significant effects. The provision establishes a general prohibition with three narrowly defined exceptions, each accompanied by mandatory safeguards. For the CIPP/E exam, mastering this topic requires understanding the interplay between automated decision-making, profiling, transparency obligations, DPIAs, special category data, and the distinction between Article 22 and Article 21. By systematically analysing exam scenarios against these elements, candidates can confidently and accurately answer questions on this important area of data protection law.