DOXIA AXIS
Regulation · 08 Apr 2026 · 3 min read

The EU AI Act's GPAI rules are in force. Your procurement timeline just moved.

The General-Purpose AI obligations under the EU AI Act took effect on 2 August 2025. Most operators still haven't traced the contractual consequences through their AI stack. Here's the practical read-out.

The headline

The General-Purpose AI (GPAI) provisions of the EU AI Act became applicable on 2 August 2025. Unlike the high-risk-system obligations, which phase in through 2026 and 2027, GPAI duties already bind any provider placing a general-purpose AI model on the EU market.

If you use a frontier model in a product that touches an EU user, your vendor is now within the scope of the Act. That changes what your vendor is required to give you, and by extension what you should be asking for in your contracts.

What GPAI providers must now do

The live obligations:

  • Technical documentation for the model, made available to downstream providers who integrate it and to the AI Office on request.
  • Summary of training content — the provider must publish a "sufficiently detailed summary" of the data used to train the model.
  • Copyright compliance policy — a written policy demonstrating compliance with EU copyright law, including opt-outs under the DSM directive.
  • Systemic-risk obligations for models above a compute threshold (≥10²⁵ FLOPs): incident reporting, adversarial testing, cybersecurity, post-market monitoring.

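For the compute threshold, a rough back-of-envelope check is possible with the widely used 6·N·D heuristic (roughly 6 FLOPs per parameter per training token). This is a sketch under that assumption; the parameter and token counts below are illustrative placeholders, not any vendor's real numbers, and the heuristic is an approximation, not how the AI Office measures compute.

```python
# Rough check of estimated training compute against the EU AI Act's
# 10^25 FLOP systemic-risk threshold, using the common ~6*N*D
# approximation (about 6 FLOPs per parameter per training token).
# All figures below are illustrative, not real vendor numbers.

SYSTEMIC_RISK_THRESHOLD_FLOPS = 1e25

def estimated_training_flops(params: float, tokens: float) -> float:
    """Approximate total training compute via the 6*N*D heuristic."""
    return 6.0 * params * tokens

def crosses_threshold(params: float, tokens: float) -> bool:
    """True if the estimate meets or exceeds the systemic-risk threshold."""
    return estimated_training_flops(params, tokens) >= SYSTEMIC_RISK_THRESHOLD_FLOPS

# A hypothetical 70B-parameter model trained on 15T tokens
# lands at ~6.3e24 FLOPs, just under the threshold:
flops = estimated_training_flops(70e9, 15e12)
print(f"{flops:.2e} FLOPs, systemic risk: {crosses_threshold(70e9, 15e12)}")
```

The point of the sketch: the threshold is close enough to current frontier training runs that "are we under it?" is a question worth asking your vendor explicitly rather than estimating yourself.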
The Code of Practice, published in July 2025, is the voluntary instrument providers can sign to signal compliance. Most US frontier labs have signed. Meta has not.

What this means for your procurement

Three things shift for any business using GPAI models inside EU-facing products:

1. You can now demand the training-data summary

It is no longer proprietary: providers are required to publish it. If your compliance or risk function asks for it and your vendor stonewalls, that's a flag.

2. Your DPA needs a GPAI-specific addendum

Standard data processing agreements don't cover the new obligations. You want your vendor to contractually confirm:

  • Technical documentation is maintained and made available to you on request
  • The copyright-compliance policy is in force and applies to the model version you use
  • Incident reporting under Article 55 will flow downstream to you within a defined window

3. Your own AI use policy inherits the obligations

If you deploy a GPAI model into a "high-risk" use case under Annex III, you become an AI Act deployer, with your own transparency, oversight, and logging duties. Most operators we audit have not yet classified their use cases. This is the first gap we flag in the legal lane of every Free First Audit.

Enforcement — the actual pressure

Here's the uncomfortable part: the enforcement machinery (the AI Office plus national market-surveillance authorities) is still staffing up, and the Commission's power to fine GPAI providers only applies from 2 August 2026. But the obligations are binding law today, and the first few enforcement actions will define the bar.

We're tracking that actively. Expect the pressure to start landing somewhere between Q3 2026 (first Code-of-Practice audit wave) and Q1 2027 (first material fines).

What to do this quarter

  1. Map your GPAI stack. Which models, from which vendors, in which product surfaces, touching which users. Most operators don't have this map.
  2. Refresh your vendor paper. Add two clauses: a GPAI-obligations confirmation, and a GPAI-specific breach-notification window.
  3. Classify your use cases against Annex III. If any are high-risk, you have deployer obligations kicking in — not future, today.
  4. Publish a public AI use disclosure if you use GPAI models to generate customer-facing content. Article 50 transparency duties take effect on 2 August 2026, inside this planning horizon.
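Steps 1 and 3 above are, in practice, one structured inventory. A minimal sketch of what that map might look like as data, assuming a simple schema of vendor, model, product surface, EU exposure, and Annex III category — every name, category, and entry here is an illustrative placeholder, not an authoritative reading of the Act:

```python
from dataclasses import dataclass
from typing import Optional

# Sketch of a GPAI stack inventory: which models, from which vendors,
# in which product surfaces, and whether any use falls under Annex III.
# All vendors, models, and classifications are hypothetical examples.

@dataclass
class GPAIUse:
    vendor: str
    model: str
    product_surface: str
    eu_facing: bool
    annex_iii_category: Optional[str] = None  # e.g. "employment"; None if not high-risk

    @property
    def deployer_obligations(self) -> bool:
        # Deployer duties attach when an EU-facing use falls under Annex III.
        return self.eu_facing and self.annex_iii_category is not None

stack = [
    GPAIUse("ExampleLab", "frontier-v2", "support chatbot", eu_facing=True),
    GPAIUse("ExampleLab", "frontier-v2", "CV screening", eu_facing=True,
            annex_iii_category="employment"),
    GPAIUse("OtherVendor", "gen-img-1", "internal mockups", eu_facing=False),
]

high_risk = [u for u in stack if u.deployer_obligations]
for u in high_risk:
    print(f"Deployer obligations: {u.model} in '{u.product_surface}' "
          f"({u.annex_iii_category})")
```

The design point: the same model from the same vendor can be low-risk in one product surface and high-risk in another, so the inventory has to be per use, not per model.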

If this was useful, a Free First Audit includes a Legal & Compliance lane that produces exactly this map for your business — in three to five days. Start here.