Who's responsible under the EU AI Act

The EU AI Act (Regulation 2024/1689) splits obligations across four roles. If your enterprise uses AI agents, you are almost certainly a deployer — and that's where most of the work sits.

⚠ Our reading, not legal advice

This page reflects Agentic Thinking's interpretation of the EU AI Act. The Act is new and evolving.

Before acting on anything here, your organisation should:

  1. Engage qualified EU-law counsel to determine which articles apply to your deployment
  2. Have counsel draft compliance requirements per article for your use case
  3. Document operationally how your organisation will meet each requirement
  4. Review periodically as Commission guidance and case law develop

Agentic Thinking Ltd is an infrastructure vendor, not a law firm. No solicitor-client relationship arises from reading this page.

The four roles

Role                         | What it means                                    | Example
GPAI provider                | Builds the underlying AI model                   | Anthropic (Claude), OpenAI (GPT), Moonshot (Kimi)
High-risk AI system provider | Builds or sells an AI system for a high-risk use | A vendor selling an AI credit-scoring system
Deployer                     | Uses an AI system in its own operations          | A bank using that credit-scoring system
Importer / Distributor       | Brings a non-EU AI product into the EU market    | A UK firm reselling a US AI product into the EU

What each owes

GPAI providers
Technical documentation, copyright policy, training-data summary, downstream information (Art. 53).
Applies from 2 Aug 2025.

High-risk AI system providers
Risk management, automatic logging, human-oversight design, conformity assessment, post-market monitoring, serious-incident reporting (Arts. 9–15, 19, 43, 72, 73).
Applies from 2 Aug 2026.

Deployers
Follow the provider's instructions for use, assign a trained human overseer, monitor operation, retain logs for at least 6 months, inform affected people, cooperate with regulators (Art. 26).
Applies from 2 Aug 2026.
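The deployer's log-retention duty (Art. 26(6): keep automatically generated logs for at least six months, longer where other EU or national law requires) translates directly into an operational rule. A minimal sketch of how a deployer might stamp each agent-decision record with a retention horizon — the `AgentDecisionLog` class, field names, and six-month floor of 183 days are our illustrative assumptions, not anything the Act prescribes:

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone
import json

# Art. 26(6): logs must be kept for at least six months.
# 183 days is our conservative reading of "six months"; confirm with counsel.
MIN_RETENTION = timedelta(days=183)

@dataclass
class AgentDecisionLog:
    """Hypothetical audit record for one AI-agent decision."""
    system_id: str        # which AI system produced the decision
    decision: str         # what the agent did or recommended
    human_overseer: str   # the trained overseer assigned under Art. 26(2)
    created_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

    def retain_until(self) -> datetime:
        # Earliest date the record may be deleted.
        return self.created_at + MIN_RETENTION

    def to_json(self) -> str:
        # Serialise for an append-only audit store.
        return json.dumps({
            "system_id": self.system_id,
            "decision": self.decision,
            "human_overseer": self.human_overseer,
            "created_at": self.created_at.isoformat(),
            "retain_until": self.retain_until().isoformat(),
        })

record = AgentDecisionLog("credit-scorer-v2", "declined application", "j.doe")
print(record.to_json())
```

The point of stamping `retain_until` at write time, rather than computing it at deletion time, is that the retention rule in force when the record was created travels with the record.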

When a deployer becomes a provider

⚠ Watch out — Art. 25

A deployer can become a provider by putting their name on a high-risk system, making substantial modifications to it, or repurposing a general AI model specifically for a high-risk task. When that happens, the full provider obligation stack attaches to the deployer.
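The three triggers are disjunctive — any one of them flips the role. A sketch of that decision rule as a checklist function (the function and parameter names are ours; whether a given change counts as "substantial modification" is a question for counsel):

```python
def deployer_becomes_provider(puts_own_name_on_system: bool,
                              makes_substantial_modification: bool,
                              repurposes_ai_for_high_risk: bool) -> bool:
    """Hypothetical Art. 25 checklist: any single trigger flips
    a deployer into the full provider obligation stack."""
    return (puts_own_name_on_system
            or makes_substantial_modification
            or repurposes_ai_for_high_risk)

# E.g. a bank that heavily fine-tunes a vendor's scoring model may have
# made a substantial modification (counsel's call, not ours):
print(deployer_becomes_provider(False, True, False))  # → True
```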

Fines

€35M or 7%
of worldwide annual turnover, whichever is higher — prohibited practices (Art. 5)
€15M or 3%
of worldwide annual turnover, whichever is higher — most other breaches (Art. 99)
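Because the cap is the higher of a fixed amount and a turnover percentage, the turnover leg dominates for large firms. A quick sketch of the arithmetic (our illustrative function, not a legal calculator — actual fines are set by regulators case by case):

```python
def max_administrative_fine(worldwide_turnover_eur: float,
                            prohibited_practice: bool) -> float:
    """Upper bound under Art. 99: fixed cap or percentage of
    worldwide annual turnover, whichever is higher."""
    fixed_cap, pct = ((35_000_000, 0.07) if prohibited_practice
                      else (15_000_000, 0.03))
    return max(fixed_cap, pct * worldwide_turnover_eur)

# A firm with €1bn turnover: 7% (€70M) exceeds the €35M floor.
print(max_administrative_fine(1_000_000_000, True))  # → 70000000.0
```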
Our reading of Regulation (EU) 2024/1689. Not legal advice. Get EU-qualified counsel before acting on this. · Last reviewed 2026-04-23