25 APRIL 2026 · COMPLIANCE

EU AI Act compliance, built into the platform

Article 12 of the EU AI Act takes effect on 2 August 2026. Every provider of a high-risk AI system must keep automatic event logs throughout the system's lifetime. Two minutes below: the articles we cover, and how.

2 MIN 18 SEC · ARTICLES 12, 9, 13, 14, 15, 26 · HookBus Enterprise Suite

Why this video, and why now

The 2 August 2026 deadline is becoming the regulator's anchor in every EU enterprise procurement conversation about AI agents. CISOs are asking the same question across banking, insurance, defence, and the public sector: which articles do you actually deliver evidence for, and how?

So we filmed the answer. Article by article, what the HookBus Enterprise Suite produces. No "compliance theatre". No claim that buying our product makes your organisation compliant. Compliance is your programme. Our job is to produce the evidence the act requires, in the format your assessor expects.

Articles we stake

ARTICLE 12 · in force 2 August 2026
Record-keeping. Automatic event logging throughout the system's lifetime.
Auditor subscriber: SHA-256 hash-chained log of every agent action HookBus sees. Tamper-evident, exportable, in the format your assessor expects.
ARTICLE 9
Risk management system, documented and operating across the lifecycle.
Same evidence pack: versioned protocol, reasoned decision logs, every block, every allow, every override recorded with the reason.
ARTICLE 13
Transparency of operation. Instructions and reasoning behind decisions.
Same evidence pack: the decision logs surface why an action was allowed, denied, or asked.
ARTICLE 14
Human oversight of the AI system. Capacity to intervene.
CRE-AgentProtect ask-mode + PIN override: the agent pauses, a human approves, the audit trail names who approved what and when.
ARTICLE 15
Accuracy, robustness, cybersecurity. Bounded behaviour, secure operation.
DLP Filter + local L1/L2: regulated data never leaves the agent boundary. Layer 1 is deterministic. Layer 2 is the LLM of your choice (Granite, Bedrock, Watsonx, Vertex, Azure OpenAI, or fully on-prem). Inference stays inside your boundary.
ARTICLE 26 · deployer
Monitor the AI system in operation. Maintain the audit trail.
Same evidence pack supports the deployer obligation. The product the provider buys is the same product the deployer monitors.
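The tamper-evidence behind a hash-chained audit log is simple to illustrate. The sketch below is not HookBus's implementation, and the record fields are assumptions; it only shows the principle Article 12 evidence rests on: each record carries the SHA-256 hash of the one before it, so editing any past record breaks every link after it.

```python
import hashlib
import json

def append_event(chain, event):
    """Append an event, linking it to the previous record's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    record = {"event": event, "prev": prev_hash}
    payload = json.dumps(record, sort_keys=True).encode()
    record["hash"] = hashlib.sha256(payload).hexdigest()
    chain.append(record)
    return record

def verify(chain):
    """Recompute every link; any edited record breaks the chain."""
    prev_hash = "0" * 64
    for record in chain:
        payload = json.dumps(
            {"event": record["event"], "prev": record["prev"]},
            sort_keys=True,
        ).encode()
        if record["prev"] != prev_hash:
            return False
        if record["hash"] != hashlib.sha256(payload).hexdigest():
            return False
        prev_hash = record["hash"]
    return True

log = []
append_event(log, {"actor": "agent-7", "action": "file.read", "decision": "allow"})
append_event(log, {"actor": "agent-7", "action": "net.send", "decision": "deny"})
assert verify(log)

log[0]["event"]["decision"] = "deny"  # tamper with history
assert not verify(log)                # the chain no longer validates
```

An assessor (or anyone holding an export) can re-run the verification independently: no trust in the log's producer is needed to detect after-the-fact edits.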

What we don't claim

We do not stake compliance to articles where we do not deliver direct evidence.

Coverage of the articles we do stake depends on what your AI runtime exposes through its lifecycle hooks. We are leading the work with AI providers to ensure those hooks deliver what the act requires. Our reading of the four-role split (provider, deployer, importer, distributor) is on the EU AI Act page.

Book a 30-minute walkthrough

If your organisation provides or deploys AI agents and you want to walk article-by-article through what HookBus produces against your specific use case, book a demo.

Book a demo →
← Back to all posts