Roles Under AI Laws: Providers, Deployers, Importers and Distributors
Under emerging AI laws, particularly the EU AI Act, distinct roles are defined to assign responsibilities across the AI value chain. These roles ensure accountability at every stage of an AI system's lifecycle.

**Providers** are entities that develop or commission the development of an AI system and place it on the market or put it into service under their own name or trademark. Providers bear the most significant obligations, including conducting conformity assessments, ensuring compliance with technical standards, implementing risk management systems, maintaining documentation, and establishing post-market monitoring. They are responsible for the AI system's design, safety, and overall compliance before it reaches end users.

**Deployers** (sometimes called 'users' in regulatory contexts) are organizations or individuals that use AI systems under their authority, except for personal, non-professional use. Deployers must use AI systems in accordance with the instructions provided by the provider, monitor the system's operation, report malfunctions or risks, conduct data protection impact assessments where applicable, and ensure human oversight is maintained. They are responsible for the contextual application of the AI system.

**Importers** are entities established within a jurisdiction (e.g., the EU) that place AI systems from third-country providers onto the market. Importers must verify that the provider has completed the required conformity assessments, that proper documentation exists, and that the AI system bears the necessary markings and compliance indicators. They serve as a critical gateway ensuring foreign-developed AI meets domestic standards.
**Distributors** are entities in the supply chain, other than providers or importers, that make AI systems available on the market. Distributors must verify that the system carries required conformity markings and documentation and must not supply systems they know to be non-compliant. These role-based frameworks create a layered accountability structure, ensuring that every entity handling an AI system shares appropriate responsibility for its safety, transparency, and legal compliance throughout its lifecycle.
Roles Under AI Laws: Providers, Deployers, Importers and Distributors – A Comprehensive Guide
Why This Topic Is Important
Understanding the distinct roles defined under AI legislation — particularly the EU AI Act — is critical for anyone working in AI governance, privacy, or compliance. AI laws assign specific obligations to different actors in the AI value chain, and misidentifying a role can lead to misallocated responsibilities, regulatory non-compliance, and significant penalties. For exam purposes, this topic frequently appears in scenario-based questions where you must correctly identify which entity bears which obligation. Mastering these roles is foundational to understanding how AI regulation operates in practice.
What Are the Roles Under AI Laws?
AI legislation, most notably the EU AI Act, defines several key roles along the AI supply chain. Each role carries distinct legal obligations and responsibilities:
1. Provider (Developer)
A provider is the entity that develops an AI system or has an AI system developed on its behalf and places it on the market or puts it into service under its own name or trademark. This is analogous to the concept of a "manufacturer" in product safety law.
Key characteristics of a Provider:
- Develops or commissions the development of an AI system
- Places the system on the market or puts it into service
- Does so under its own name or trademark
- Bears the heaviest regulatory obligations
Provider obligations typically include:
- Conducting conformity assessments before placing high-risk AI on the market
- Establishing and maintaining a quality management system
- Drawing up technical documentation
- Ensuring the AI system complies with relevant requirements (data governance, transparency, accuracy, robustness, cybersecurity)
- Registering the AI system in the EU database (for high-risk systems)
- Implementing post-market monitoring
- Reporting serious incidents to authorities
- Affixing CE marking where required
- Keeping logs generated by the AI system (where applicable)
2. Deployer (User)
A deployer is any entity that uses an AI system under its authority, except where the AI system is used in the course of a personal, non-professional activity. In earlier drafts of the EU AI Act, this role was referred to as the "user." The deployer is the organization that puts the AI system to work in a real-world operational context.
Key characteristics of a Deployer:
- Uses an AI system in a professional or commercial capacity
- Does not develop the AI system but applies it operationally
- May configure, customize, or fine-tune the system within parameters set by the provider
Deployer obligations typically include:
- Using the AI system in accordance with the provider's instructions for use
- Ensuring human oversight as required by the AI Act
- Monitoring the operation of the AI system based on instructions
- Keeping logs automatically generated by the AI system (for a specified period)
- Conducting a Data Protection Impact Assessment (DPIA) where required under GDPR
- Conducting a Fundamental Rights Impact Assessment (FRIA) for certain high-risk AI systems (applicable to public bodies and certain private entities)
- Informing natural persons that they are subject to a high-risk AI system
- Suspending use and notifying the provider or distributor if a serious incident occurs
Important note: A deployer can become a provider if it substantially modifies the AI system, places the system on the market under its own name, or changes the intended purpose of a high-risk AI system. This role-shifting concept is frequently tested in exams.
3. Importer
An importer is any entity located or established in the EU that places an AI system on the EU market that has been developed by a provider established outside the EU. The importer acts as a critical gatekeeper ensuring that AI systems entering the EU market comply with the AI Act.
Key characteristics of an Importer:
- Located or established within the EU
- Places a third-country (non-EU) provider's AI system on the EU market
- Serves as a compliance bridge between foreign providers and EU regulators
Importer obligations typically include:
- Verifying that the provider has carried out the appropriate conformity assessment
- Verifying that the provider has drawn up the required technical documentation
- Ensuring the AI system bears the required CE marking and is accompanied by required documentation and instructions for use
- Ensuring that the provider has appointed an authorised representative in the EU (where required)
- Not placing an AI system on the market if they have reason to believe it does not comply with the AI Act
- Providing their name, registered trade name or trademark, and contact details on the AI system or its packaging
- Ensuring that storage and transport conditions do not jeopardize the AI system's compliance while it is under their responsibility
- Cooperating with national competent authorities
- Keeping a copy of the EU declaration of conformity
4. Distributor
A distributor is any entity in the supply chain, other than the provider or the importer, that makes an AI system available on the EU market. Distributors do not develop, import, or deploy the system — they facilitate its availability.
Key characteristics of a Distributor:
- Part of the supply chain, but not the provider, importer, or deployer
- Makes the AI system available on the market (e.g., resellers, retailers, platform marketplaces)
Distributor obligations typically include:
- Verifying that the high-risk AI system bears the required CE marking
- Verifying that the provider and importer have fulfilled their documentation obligations
- Not making an AI system available on the market if they have reason to believe it is non-compliant
- Ensuring that storage or transport conditions do not jeopardize the system's compliance
- Informing the provider or importer if the system presents a risk
- Cooperating with competent authorities upon request
How the Role Framework Works in Practice
The AI Act uses a product safety approach inspired by the EU's New Legislative Framework (NLF). Each actor in the value chain has obligations proportionate to their influence over the AI system:
- Providers have the most extensive obligations because they design and create the system.
- Deployers have significant obligations because they determine how the system is used in practice and directly affect individuals.
- Importers serve as gatekeepers ensuring foreign-made AI systems meet EU standards.
- Distributors have lighter obligations focused on due diligence and verification before making AI systems available.
The framework recognizes that roles can shift. For example:
- A deployer who substantially modifies a high-risk AI system becomes a provider for that modified system.
- A deployer who places the AI system on the market under its own name becomes a provider.
- A deployer who changes the intended purpose of a high-risk AI system is treated as a provider and must comply with provider obligations.
- A distributor or importer who modifies an AI system or changes its intended purpose also becomes a provider.
This role-shifting mechanism ensures that obligations follow the entity that has actual control and influence over the system's safety and compliance.
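The role-shifting conditions above amount to a simple decision rule. The following sketch models it in Python purely as a study aid; the field names (`substantially_modified`, `rebranded`, `repurposed`) are illustrative assumptions, and the logic is a simplification of the AI Act's actual provisions, not legal advice.

```python
# Illustrative sketch of the AI Act's role-shifting rules, for study purposes
# only. Field names and logic are simplified assumptions, not legal advice.
from dataclasses import dataclass

@dataclass
class Actor:
    role: str                              # "deployer", "importer", or "distributor"
    substantially_modified: bool = False   # substantially modified a high-risk system
    rebranded: bool = False                # placed system on market under own name
    repurposed: bool = False               # changed the system's intended purpose

def effective_role(actor: Actor) -> str:
    """Return the role whose obligations apply after any role shift."""
    if actor.substantially_modified or actor.rebranded or actor.repurposed:
        # Obligations follow the entity with actual control over the system
        return "provider"
    return actor.role
```

For example, `effective_role(Actor("deployer", substantially_modified=True))` yields `"provider"`, reflecting that the modifying deployer now bears provider obligations for the modified system.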
Comparison Table
Provider: Develops or commissions development; places on market under own name; bears heaviest obligations (conformity assessment, technical documentation, post-market monitoring, quality management system, incident reporting).
Deployer: Uses the AI system professionally; follows instructions; ensures human oversight; monitors operations; conducts FRIA/DPIA; can become a provider through substantial modification or rebranding.
Importer: EU-based entity bringing non-EU AI systems into the EU market; verifies conformity assessment and documentation; acts as gatekeeper.
Distributor: Makes AI system available on the market (not provider, importer, or deployer); verifies CE marking and documentation; lighter obligations focused on due diligence.
Key Distinctions to Remember for the Exam
1. Provider vs. Deployer: The provider creates; the deployer uses. If a deployer substantially modifies, rebrands, or repurposes the system, they become a provider.
2. Importer vs. Distributor: The importer is the first entity to bring a non-EU AI system into the EU market. The distributor makes systems available that are already on the EU market. Both have verification duties, but importers have more extensive obligations.
3. Role-shifting: Any actor (deployer, importer, distributor) can become a provider by substantially modifying the AI system, placing it on the market under their own name, or changing its intended purpose.
4. Obligations scale with influence: The entity with the most control over the AI system's design and functionality (the provider) bears the most responsibility.
5. Analogy to GDPR: While not a perfect analogy, the provider/deployer relationship loosely parallels the processor/controller relationship under GDPR. The deployer (like a controller) determines how the AI system is used in its operational context, while the provider (like a processor in some ways) provides the tool. However, unlike GDPR processors, AI Act providers bear heavier obligations than deployers in many respects.
Exam Tips: Answering Questions on Roles Under AI Laws
Tip 1: Identify the Role by Actions, Not Labels
Exam questions often present scenarios without explicitly naming the role. Focus on what the entity does: Does it develop the system? Use it? Bring it into the EU from abroad? Resell it? Match the actions to the role definition.
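This action-matching approach can be sketched as a small classifier. The flags below are simplified assumptions chosen for exam practice, and the function returns a set because (as discussed later in this guide) an entity can hold several roles at once.

```python
# Hedged sketch: map an entity's actions in a scenario to its AI Act role(s).
# The flags are simplified assumptions for exam practice, not legal criteria.
def classify_role(develops: bool,
                  uses_professionally: bool,
                  first_to_import_into_eu: bool,
                  resells_on_eu_market: bool) -> set[str]:
    """An entity can occupy more than one role, so return a set of roles."""
    roles = set()
    if develops:
        roles.add("provider")
    if uses_professionally:
        roles.add("deployer")
    if first_to_import_into_eu:
        roles.add("importer")
    if resells_on_eu_market:
        roles.add("distributor")
    return roles
```

A company that develops a system and also uses it internally would yield `{"provider", "deployer"}` and must satisfy both sets of obligations.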
Tip 2: Watch for Role-Shifting Scenarios
A very common exam pattern is presenting a deployer who modifies an AI system and asking what obligations now apply. Remember: substantial modification, rebranding, or repurposing transforms a deployer into a provider.
Tip 3: Remember the Gatekeeper Function of Importers
Importers must verify that the non-EU provider has completed conformity assessments and documentation. If asked about who ensures compliance of a foreign-made AI system entering the EU, the answer is the importer.
Tip 4: Distributors Have the Lightest Obligations
Distributors mainly verify that CE markings and documentation are in place. They do not conduct conformity assessments or create technical documentation.
Tip 5: Link Obligations to the Correct Role
Conformity assessments, technical documentation, quality management systems, and post-market monitoring are provider obligations. Human oversight, FRIA, and using the system per instructions are deployer obligations. Verification of documentation and CE marking are importer and distributor obligations.
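The obligation-to-role pairings above can be drilled with a simple lookup table. This is a memory aid only; the mapping is simplified and not exhaustive, and the entries mirror the summary in this tip rather than the full text of the AI Act.

```python
# Study-aid sketch: key obligations paired with the role that bears them,
# mirroring the summary above. Simplified and non-exhaustive.
OBLIGATION_TO_ROLE = {
    "conformity assessment": "provider",
    "technical documentation": "provider",
    "quality management system": "provider",
    "post-market monitoring": "provider",
    "human oversight": "deployer",
    "fundamental rights impact assessment": "deployer",
    "use per instructions": "deployer",
    "verify ce marking and documentation": "importer/distributor",
}

def who_bears(obligation: str) -> str:
    """Look up which role bears a given obligation (case-insensitive)."""
    return OBLIGATION_TO_ROLE.get(obligation.lower(), "unknown")
```

For instance, `who_bears("Human oversight")` returns `"deployer"`, while an unlisted obligation returns `"unknown"`.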
Tip 6: Know the FRIA and DPIA Connection
Deployers of certain high-risk AI systems (especially public bodies and entities providing essential services) must carry out a Fundamental Rights Impact Assessment. The DPIA obligation comes from the GDPR. Both of these are deployer-side obligations, not provider-side.
Tip 7: Use Process of Elimination
If a question asks which entity is responsible for a specific obligation and you are unsure, eliminate roles that clearly do not fit. For instance, a distributor would never be responsible for conducting a conformity assessment unless they have modified the system and become a provider.
Tip 8: Distinguish Between Placing on the Market and Putting into Service
"Placing on the market" means making an AI system available for the first time on the EU market. "Putting into service" means supplying the AI system for first use. A provider may do either or both. A deployer typically puts the system into service (uses it). Understanding this distinction helps in correctly answering questions about when obligations are triggered.
Tip 9: Consider Multi-Role Scenarios
An entity can occupy more than one role simultaneously. For example, a company that develops an AI system and also uses it internally is both a provider and a deployer. In such cases, the entity must fulfill obligations associated with both roles.
Tip 10: Remember the Product Safety Heritage
The EU AI Act's role framework is modeled on the EU's New Legislative Framework for product safety. If you are familiar with roles like manufacturer, importer, distributor, and end-user in product safety law, you can draw parallels. The provider maps to manufacturer, the deployer maps to the professional user, and importer and distributor retain their traditional meanings.
Summary
The roles under AI laws create a structured framework of accountability across the AI value chain. Providers bear the primary burden as system creators. Deployers are accountable for responsible use. Importers ensure that non-EU systems meet EU requirements before entering the market. Distributors exercise due diligence in making AI systems available. Understanding these roles, their obligations, and the conditions under which roles shift is essential for effective AI governance and for successfully answering exam questions on this topic.