
EU AI Act August 2026: What European SMEs Must Do Before the GPAI Deadline

PhD in Computational Linguistics. I build the operating systems for responsible AI. Founder of First AI Movers, helping companies move from "experimentation" to "governance and scale." Writing about the intersection of code, policy (EU AI Act), and automation.

TL;DR: The EU AI Act August 2026 deadline for general-purpose AI systems is 4 months away. The practical compliance checklist every European SME needs.

The EU AI Act August 2026 deadline is closer than most operations directors and compliance leads realise. On 2 August 2026, obligations tied to general-purpose AI (GPAI) systems become enforceable across the European Union. For a 25-person HR software firm in Munich using OpenAI's API to generate candidate summaries, or a 40-person professional services firm in Amsterdam whose client portal surfaces AI-written reports, the question is no longer theoretical: are you ready?

The good news is that the compliance burden for most growing software companies and founder-led businesses falls well short of what the headlines suggest. You almost certainly do not need to train a foundation model or navigate the full weight of Article 53. What you do need is a documented, transparent approach to the GPAI tools already running in your stack. This article tells you exactly what that looks like.

What the August 2026 Deadline Actually Means

The EU AI Act (Regulation 2024/1689) entered into force in August 2024, with obligations rolling out in phases. Rules for GPAI model providers began applying on 2 August 2025; on 2 August 2026, the bulk of the remaining obligations become applicable, including the transparency rules for AI systems and enforcement of the GPAI provisions.

Two sets of obligations matter here, and mixing them up is the most common mistake a legal lead or CTO makes:

Article 53 obligations apply to GPAI model providers. These are the organisations that train, develop, and place a general-purpose AI model on the EU market. Think the companies behind large language models, image generation engines, or foundation models sold as API services. If your company is not in that business, Article 53 is not your primary concern.

Article 50 transparency obligations apply to deployers. A deployer, under Article 3 of the EU AI Act, is any natural or legal person, public authority, or other body using an AI system under its own authority, outside purely personal, non-professional activity. If your mid-sized operations team is integrating a GPAI model into a customer-facing product, an internal workflow tool, or a document generation pipeline, you are a deployer. Article 50 is your framework.

For most European SMEs, the practical question is: what do deployers owe their users and regulators by August 2026?

Which SMEs Are Affected as Deployers

The deployer category is broader than many compliance officers expect. You are a deployer if you:

  • Integrate a GPAI API (such as a large language model or image generator) into a product or service
  • Use an AI-powered SaaS tool where your configuration choices materially shape the AI's output
  • Publish AI-generated or AI-assisted content to end users without disclosure

Size does not create an exemption. A 15-person legal technology startup in Warsaw that uses an LLM to draft contract summaries for clients is a deployer with Article 50 obligations. A 50-person marketing agency in Barcelona that generates client-facing copy via an AI writing tool and publishes it without labelling is a deployer with a disclosure gap.

The regulation does include a proportionality principle: obligations must be applied in a manner proportionate to the size and nature of the entity. However, proportionality reduces administrative burden, not the core transparency duties.

The 6-Point Compliance Checklist for SME Deployers

This checklist addresses the practical actions a compliance officer or CTO at a 15-to-50-person European company needs to complete before August 2026.

1. Add Transparency Notices for AI-Generated Content

Article 50 requires deployers to ensure that users interacting with AI systems, or receiving AI-generated content, are informed of that fact. If your product surfaces AI-generated text, images, audio, or video to end users, a disclosure is required. The disclosure must be clear, prominent, and presented at the point of interaction, not buried in terms and conditions.

For a growing software company, this typically means an interface label ("This summary was generated by AI"), a tooltip, or a notice in the document header. For a professional services firm sending AI-drafted reports to clients, a standard footer disclosure or cover-page notice satisfies the requirement.
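For teams wiring the notice into a product rather than a document template, the pattern is simple enough to sketch. The wording and placement below are illustrative assumptions, not text prescribed by the Act:

```python
# Minimal sketch: attach a clear, user-facing AI disclosure to generated
# output before it reaches the end user. The notice wording is an
# assumption -- adapt it to your product and audience.

AI_NOTICE = "This summary was generated by AI and reviewed by our team."

def with_disclosure(ai_text: str, notice: str = AI_NOTICE) -> str:
    """Return AI-generated text with a clearly separated disclosure notice."""
    return f"{ai_text}\n\n---\n{notice}"

print(with_disclosure("Candidate has 6 years of backend engineering experience."))
```

The point is that the disclosure travels with the content itself, so it survives whatever channel delivers the text: interface, email, or PDF export.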

2. Register Which GPAI Models You Use

Before you can document compliance, you need an inventory. Build a register of every GPAI model your organisation uses: the model name and version, the provider, the use case, the user population affected, and whether the output is customer-facing or internal only.

This register is foundational. It feeds your vendor contract review (point 4), your human oversight documentation (point 3), and your response to any regulatory inquiry. A spreadsheet is sufficient for most founder-led businesses at this stage. See our AI Governance Committee Charter for European SMEs for a governance structure that formalises this register without overengineering it.
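A spreadsheet works, but if your stack is code-first the same register can live in version control alongside the systems it describes. A minimal sketch; the field names, model names, and use cases shown are illustrative assumptions:

```python
# A minimal GPAI register sketch. Field names and entries are illustrative
# assumptions -- a spreadsheet with the same columns is equally valid.
from dataclasses import dataclass, asdict
import csv, io

@dataclass
class GPAIRegisterEntry:
    model: str              # model name and version
    provider: str           # who places the model on the market
    use_case: str           # what the workflow does
    users_affected: str     # who sees the output
    customer_facing: bool   # external vs internal only

register = [
    GPAIRegisterEntry("llm-api-v1", "ExampleVendor", "candidate summaries",
                      "recruiters and hiring clients", True),
    GPAIRegisterEntry("llm-api-v2", "ExampleVendor", "internal meeting notes",
                      "operations team", False),
]

# Export to CSV so the register sits with your other compliance records.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=list(asdict(register[0]).keys()))
writer.writeheader()
for entry in register:
    writer.writerow(asdict(entry))
print(buf.getvalue())
```

Whatever form it takes, treat the register as a living document: review it whenever a new AI tool enters the stack, not once a year.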

3. Document Your Human Oversight Arrangements

Article 14 of the EU AI Act requires that high-risk AI systems be designed and deployed with appropriate human oversight. For many GPAI deployments, the risk classification will not reach the high-risk threshold. However, documenting your oversight approach regardless of classification is a prudent practice that regulators and auditors will expect.

For each AI-assisted workflow, record: who reviews AI outputs before they reach the end user, what criteria govern when a human intervenes or overrides, and how errors or anomalies are reported. For an operations director running an AI-assisted invoice processing workflow, this might be a two-line policy in your internal handbook.
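Captured as structured data, the same record stays auditable and easy to keep current. All fields and thresholds below are illustrative assumptions, not requirements from the Act:

```python
# Minimal oversight record for one AI-assisted workflow. Every value here
# is an illustrative assumption -- substitute your own reviewers, criteria,
# and reporting channel.
from datetime import date

oversight_record = {
    "workflow": "AI-assisted invoice processing",
    "reviewer": "Operations Director",
    "review_point": "before any payment is approved",
    "override_criteria": [
        "invoice amount exceeds EUR 10,000",
        "vendor not on the approved supplier list",
        "extraction confidence below 90%",
    ],
    "error_reporting": "log the anomaly in the ops tracker within 24 hours",
}

# A reviewer sign-off can be as simple as appending to a log.
audit_log = []
audit_log.append({"date": date.today().isoformat(),
                  "workflow": oversight_record["workflow"],
                  "outcome": "approved"})
```

The value of keeping it this lightweight is that the record actually gets maintained, which is what a regulator or auditor will look for first.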

4. Review Vendor Contracts for Article 25 Obligations

Article 25 of the EU AI Act sets out responsibilities along the AI value chain: a deployer that puts its name on a high-risk AI system, or substantially modifies one, can be treated as a provider and inherit provider obligations. More practically for SMEs, your vendor contracts need to reflect this compliance chain.

Check whether your GPAI vendor's terms of service include: confirmation of their own compliance posture under the EU AI Act, data processing terms consistent with GDPR, and any limitations on use cases that would affect your compliance status. A vendor who cannot confirm their Article 53 compliance status by August 2026 is a vendor who introduces regulatory risk into your stack.

Use our AI Vendor Evaluation Scorecard to structure this review.

5. Apply Watermarking and Disclosure for Public-Facing AI Content

Article 50 includes specific provisions for AI-generated synthetic content. If your company publishes AI-generated images, video, or audio that could be mistaken for authentic human-created material, machine-readable disclosure (watermarking or metadata tagging) is required where technically feasible.

For most professional services firms and software companies, the relevant scenario is AI-generated images used in marketing materials or client deliverables, and AI-written text published to public websites or sent to clients. The disclosure obligation for text is met through labelling. For images and audio, check whether your generation tool produces embedded metadata or watermarks, and document that capability in your compliance record.
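One lightweight way to document that capability is a per-tool record in your compliance file. The tool names and capabilities below are assumptions for illustration; verify each entry against your vendor's actual documentation:

```python
# Sketch: record whether each generation tool embeds machine-readable
# provenance (e.g. a C2PA manifest or metadata tag). Entries are
# illustrative assumptions, not real vendor capabilities.
synthetic_content_tools = {
    "image-generator-api": {"embeds_metadata": True, "scheme": "C2PA manifest"},
    "audio-generator-api": {"embeds_metadata": False, "mitigation": "visible label"},
    "text-model-api": {"embeds_metadata": False, "mitigation": "footer disclosure"},
}

# Flag tools whose output needs a manual disclosure step instead.
gaps = sorted(name for name, cap in synthetic_content_tools.items()
              if not cap["embeds_metadata"])
print("Needs manual disclosure:", gaps)
```

A record like this answers the likely regulatory question directly: for each kind of synthetic content you publish, where does the disclosure come from?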

6. Establish Data Governance for Training Data Use

If your company fine-tunes or customises a GPAI model using your own data, including customer data, you enter a different compliance zone. Ensure that any training data use is covered by a lawful basis under GDPR, that data minimisation principles apply, and that you have documented consent or legitimate interest assessments where required.

For companies using off-the-shelf GPAI APIs with no fine-tuning, this point is lower priority. For a mid-sized operations team building proprietary AI capabilities on top of customer data, it is the most significant compliance gap to close.

What Most SMEs Are NOT Required to Do

The EU AI Act media coverage focuses heavily on obligations that do not apply to the vast majority of European SMEs. To be precise:

  • You do not need to comply with Article 53 unless you develop and place a GPAI model on the market.
  • You do not need to publish technical documentation about model training, compute thresholds, or systemic risk assessments unless you are a model provider.
  • You do not need to register your company or products in the EU database unless you deploy a high-risk AI system as defined in Annex III of the regulation.
  • You do not need a dedicated AI compliance team. A documented policy owned by a single named person is sufficient for most founder-led businesses and growing software companies under 50 people.

Conflating provider obligations with deployer obligations pushes compliance leads to over-engineer their response or, worse, to dismiss the regulation entirely because the full Article 53 framework feels disproportionate. Neither outcome serves the business.

The Safe Harbour Path: Three Steps That Cover Most SMEs

For the majority of European SMEs deploying GPAI tools, compliance resolves to a three-step approach:

First, use a vendor who can confirm their own EU AI Act compliance posture. A model provider who has done their Article 53 work creates a cleaner compliance chain for you as a deployer. Ask for a compliance summary or check their public documentation.

Second, document your usage. The GPAI register (point 2 above) combined with your human oversight notes (point 3 above) constitutes a deployable compliance record. This does not need to be a lengthy document. It needs to be accurate and current.

Third, add transparency notices. This is the visible, user-facing part of compliance. A disclosure label on AI-generated content satisfies the core Article 50 obligation and demonstrates good-faith compliance to regulators.

For a detailed implementation timeline mapped to the August 2026 deadline, see our EU AI Act August 2026 Deadline Action Plan for SMEs.

If your compliance posture needs a structured review before August, speak with our AI governance advisory team. We work specifically with European SMEs navigating the practical implications of the EU AI Act.

FAQ

Does the EU AI Act August 2026 deadline apply to my company if we only use off-the-shelf AI tools?

Yes, if those tools are integrated into your products or services and produce outputs that reach end users, you are a deployer under the EU AI Act. The August 2026 deadline activates Article 50 transparency obligations for deployers, regardless of whether you developed the underlying AI model yourself. The practical requirement is disclosure: inform users when they are interacting with AI-generated content.

What is the difference between a GPAI provider and a deployer under the EU AI Act?

A GPAI provider is an organisation that trains and places a general-purpose AI model on the EU market. Article 53 applies to providers and covers technical documentation, copyright transparency, and systemic risk assessments for the most capable models. A deployer is any organisation that puts an AI system into use, including by integrating a third-party model into a product. Most European SMEs are deployers, not providers, and their primary obligations fall under Article 50 and the general deployer duties in Article 26.

What happens if we miss the August 2026 deadline?

The EU AI Act provides national market surveillance authorities with enforcement powers, including the ability to issue fines. For GPAI-related violations, fines can reach 3% of global annual turnover or 15 million euros, whichever is higher, for providers. For deployers, enforcement will focus on transparency failures and non-compliance with deployer obligations. Beyond regulatory risk, missing the deadline creates a commercial exposure: enterprise customers and public-sector clients will increasingly require a documented AI compliance posture as a procurement condition.

Further Reading