EU AI Act in Force: The Operational Checklist for Dutch SME Managers


TL;DR: The EU AI Act enforcement phase started January 2026. Dutch SME managers need a practical checklist — not a legal briefing. Here is what to audit, fix, and document.

The EU AI Act moved into its enforcement phase in January 2026. For most Dutch SMEs, the public conversation around the Act has been dominated by legal briefings and high-level summaries that are accurate but not operational. The question for a manager running a 20-50 person company is not "what does the Act say" but "what do we actually need to do."

This checklist is structured around that question.


What the EU AI Act Actually Requires of SMEs

The Act is risk-based. The obligations it creates depend on how your company relates to AI systems:

  • Provider: You develop an AI system, or substantially modify one, and place it on the market or put it into service under your own name
  • Deployer: You use an AI system under your authority in your business operations
  • Distributor or importer: You resell or import AI systems built by others

Most Dutch SMEs are deployers. They use AI tools built by others — AI assistants, document processing tools, customer-facing chatbots, recommendation systems — in their operations. The Act's obligations for deployers are less demanding than for providers, but they are real.


Phase 1 Obligations (Active Since February 2025)

The first provisions to take effect, in force since February 2025, cover prohibited AI practices. If your company uses any of the following, you are already non-compliant:

  • AI systems that use subliminal techniques to manipulate behaviour in ways harmful to users
  • AI systems that exploit vulnerabilities of specific groups (age, disability) to manipulate decisions
  • Social scoring systems that evaluate or classify individuals based on personal characteristics
  • Real-time biometric identification in publicly accessible spaces (with narrow exceptions)

For most SMEs, these provisions are not directly relevant to your current tool stack. But they are worth checking against any AI-assisted marketing tools, customer profiling systems, or access control systems you use.
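A lightweight way to run that check is to record, for each tool, a yes/no answer against the four prohibited categories. The sketch below is a minimal, hypothetical example of such a screen; the tool name and flag names are placeholders rather than terms from the Act, and a "yes" answer triggers legal review, not a legal conclusion.

```python
# Minimal sketch of a prohibited-practice screen for each AI tool.
# Tool names and flag names are illustrative placeholders, not legal terms of art.

PROHIBITED_FLAGS = [
    "subliminal_manipulation",       # manipulates behaviour in ways harmful to users
    "exploits_vulnerable_groups",    # targets age or disability to steer decisions
    "social_scoring",                # scores individuals on personal characteristics
    "realtime_biometric_id_public",  # real-time biometric ID in public spaces
]

def screen_tool(tool_name: str, answers: dict[str, bool]) -> list[str]:
    """Return the prohibited-practice flags answered 'yes' for a tool."""
    return [flag for flag in PROHIBITED_FLAGS if answers.get(flag, False)]

# Example: screening a hypothetical customer-profiling tool.
hits = screen_tool("profiling-tool-x", {
    "subliminal_manipulation": False,
    "exploits_vulnerable_groups": False,
    "social_scoring": True,            # escalate before continued use
    "realtime_biometric_id_public": False,
})
if hits:
    print(f"Escalate to legal review: {', '.join(hits)}")
```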


The Deployer Checklist for Dutch SMEs

Work through this checklist in sequence. Each item should be owned by a named person in your organisation.

Step 1: Inventory Your AI Systems

List every AI system your company uses in operations. Include:

  • AI tools purchased as standalone software
  • AI features embedded in SaaS platforms (AI-assisted support in CRM, AI writing in document tools, AI scheduling in HR software)
  • AI tools used by individual team members under personal or team subscriptions
  • Any custom-built or fine-tuned models

Most Dutch SMEs find their AI system inventory is larger than expected once embedded features in existing platforms are included.

Owner: CTO, IT lead, or operations manager
Timeline: Complete within 30 days
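Keeping the inventory as structured data rather than a loose document makes the later steps easier to audit. Below is a minimal sketch of one possible register format; the field names and example entries are assumptions to adapt, not a prescribed schema.

```python
# Minimal sketch of an AI system inventory as structured records.
# Field names and example systems are hypothetical; adapt to your own tooling.
from dataclasses import dataclass, asdict
import csv

@dataclass
class AISystem:
    name: str          # tool or feature name
    vendor: str        # provider of the system
    kind: str          # "standalone", "embedded_saas", "personal_subscription", "custom"
    business_use: str  # what it is used for in operations
    owner: str         # named person responsible for this entry

inventory = [
    AISystem("CV screening assistant", "VendorA", "embedded_saas",
             "shortlisting job applicants", "HR lead"),
    AISystem("Support chatbot", "VendorB", "standalone",
             "first-line customer questions", "Operations manager"),
]

# Write the register to CSV so it can be reviewed quarterly and shared on request.
with open("ai_inventory.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(asdict(inventory[0]).keys()))
    writer.writeheader()
    writer.writerows(asdict(system) for system in inventory)
```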


Step 2: Classify Each System by Risk Category

For each system in your inventory, determine its classification under the Act:

Unacceptable risk: Prohibited systems (see Phase 1 above). Should not be in use.

High risk: Systems listed in Annex III of the Act. For Dutch SMEs, the most relevant high-risk categories are:

  • AI in recruitment and employment decisions (CV screening, performance assessment, work allocation)
  • AI in credit or insurance scoring
  • AI in educational assessment
  • AI in access to essential services

Limited risk: AI systems with specific transparency obligations — chatbots that interact with users must identify themselves as AI; deepfakes and AI-generated content require disclosure in certain contexts.

Minimal risk: The majority of AI tools (spam filters, AI writing assistants, recommendation engines in internal tools) fall here. No specific obligations beyond general data protection law.

Action: For every high-risk system you identify, proceed to Step 3. For limited-risk systems, proceed to Step 4.
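If the inventory is structured, the classification can be recorded alongside it. The sketch below maps hypothetical use-case tags to the four categories; the tags and the mapping are illustrative shortcuts and do not replace checking your specific systems against Annex III.

```python
# Minimal sketch of recording a risk classification per system.
# The use-case tags and their mapping are illustrative only; Annex III governs.

HIGH_RISK_USES = {"recruitment", "employment_decisions", "credit_scoring",
                  "insurance_scoring", "educational_assessment", "essential_services"}
LIMITED_RISK_USES = {"customer_chatbot", "ai_generated_content"}

def classify(use_tags: set[str], prohibited: bool = False) -> str:
    """Return one of: unacceptable, high, limited, minimal."""
    if prohibited:
        return "unacceptable"
    if use_tags & HIGH_RISK_USES:
        return "high"
    if use_tags & LIMITED_RISK_USES:
        return "limited"
    return "minimal"

print(classify({"recruitment"}))       # -> high
print(classify({"customer_chatbot"}))  # -> limited
print(classify({"spam_filter"}))       # -> minimal
```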


Step 3: High-Risk System Obligations

If you are deploying a high-risk AI system, the Act requires:

Human oversight mechanism: A defined process by which a qualified human can review, override, or stop the AI system's decisions. This must be documented, not just assumed.

Technical documentation: The vendor should provide technical documentation for the system. Confirm you have received and retained it.

Logging: High-risk systems must maintain logs of operation sufficient to enable post-hoc review of decisions. Confirm the system logs are retained appropriately.

Registration: Providers must register high-risk AI systems in the EU database before placing them on the market; deployers have a separate registration duty mainly where they are public authorities or bodies. Check whether your system and your role trigger registration.

Staff training: Persons operating or overseeing a high-risk AI system must have adequate training. Document what training has occurred and when.

Action: For each high-risk system, document the human oversight mechanism, confirm technical documentation receipt, verify logging is in place, check registration requirements, and record training completion.
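One way to make those actions verifiable is a per-system compliance record that reports what is still missing. The sketch below is a hypothetical example; the field names mirror the five obligations above but are not official terminology.

```python
# Minimal sketch of a per-system high-risk compliance record.
# Field names are hypothetical; they mirror the five obligations listed above.
from dataclasses import dataclass, fields

@dataclass
class HighRiskRecord:
    system: str
    human_oversight_documented: bool  # documented review/override/stop process
    technical_docs_received: bool     # vendor documentation on file
    logging_verified: bool            # operation logs retained for review
    registration_checked: bool        # EU database registration requirement checked
    staff_training_recorded: bool     # training dates and attendees recorded

def missing_items(record: HighRiskRecord) -> list[str]:
    """Return the obligations not yet satisfied for this system."""
    return [f.name for f in fields(record)
            if isinstance(getattr(record, f.name), bool) and not getattr(record, f.name)]

record = HighRiskRecord("CV screening assistant",
                        human_oversight_documented=True,
                        technical_docs_received=False,
                        logging_verified=True,
                        registration_checked=False,
                        staff_training_recorded=True)
print(missing_items(record))  # -> ['technical_docs_received', 'registration_checked']
```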


Step 4: Limited-Risk Transparency Obligations

If you deploy a customer-facing chatbot or AI assistant, users must be informed that they are interacting with an AI system, unless that is already obvious from the context. A clear disclosure at the start of the interaction is sufficient; it does not need to dominate the experience.

If you generate or distribute AI-generated content (images, video, audio, text intended as factual) at scale, disclosure requirements apply. Check whether your use cases fall under these provisions.

Action: Review any customer-facing AI interactions for disclosure compliance. Add disclosure language where absent.
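For a chatbot, the disclosure can simply be the first message of every session. The sketch below is a minimal, framework-agnostic illustration; the function name and the disclosure wording are assumptions, not prescribed text.

```python
# Minimal sketch: prepend an AI disclosure to the first message of a chat session.
# The function name and the disclosure wording are illustrative assumptions.

AI_DISCLOSURE = (
    "You are chatting with an AI assistant. "
    "Ask to speak with a member of our team at any time."
)

def open_session(first_bot_message: str) -> list[str]:
    """Return the opening messages of a session, disclosure first."""
    return [AI_DISCLOSURE, first_bot_message]

for message in open_session("Hi! How can I help you today?"):
    print(message)
```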


Step 5: Vendor Compliance Due Diligence

As a deployer, the heaviest compliance burden sits with the provider, but you are still expected to do reasonable due diligence on the systems you rely on and to be able to show it. Practically, this means the following; a minimal tracking sketch follows the list:

  • Request the vendor's EU AI Act compliance documentation: For any system in the limited-risk or high-risk category, the vendor should be able to provide a compliance summary or CE mark for the system.
  • Check your contracts: Vendor agreements for AI tools should include provisions for Act compliance. If your current contracts predate the Act, they may not include these provisions.
  • Document vendor representations: Keep records of vendor compliance claims. If a vendor's system later proves non-compliant, your documentation of due diligence affects your exposure.
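The sketch below illustrates one way to log those requests and responses with dates. Vendor names, document types, and dates are placeholders.

```python
# Minimal sketch of a vendor compliance due-diligence log.
# Vendor names, document types, and dates are illustrative placeholders.
from dataclasses import dataclass
from datetime import date

@dataclass
class VendorEvidence:
    vendor: str
    system: str
    document: str                    # e.g. "AI Act compliance summary"
    requested_on: date
    received_on: date | None = None  # None while the request is outstanding

log = [
    VendorEvidence("VendorA", "CV screening assistant",
                   "AI Act compliance summary", date(2026, 2, 3), date(2026, 2, 20)),
    VendorEvidence("VendorB", "Support chatbot",
                   "Transparency statement", date(2026, 2, 3)),
]

# Flag requests that have not yet been answered.
for evidence in log:
    if evidence.received_on is None:
        print(f"Follow up with {evidence.vendor}: {evidence.document} still outstanding")
```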

Step 6: GDPR Intersection

The EU AI Act operates alongside GDPR, not instead of it. AI systems that process personal data must comply with both. The key intersection points:

  • Automated decision-making: GDPR Article 22 gives individuals the right not to be subject to solely automated decisions with significant effects. If your AI systems make decisions about individuals — employees, customers, applicants — assess whether human review is required under GDPR as well as the Act.
  • Data minimisation: AI systems trained or fine-tuned on personal data must comply with data minimisation principles. Confirm that any personal data used in AI tool contexts is proportionate to the purpose.
  • Records of processing: AI systems that process personal data should be documented in your Records of Processing Activities (RoPA) under GDPR Article 30.
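These checks can be folded into the same register by flagging which systems process personal data and which make automated decisions about individuals. The sketch below, with hypothetical fields and entries, shows how those flags can drive a GDPR review list.

```python
# Minimal sketch: flag systems that need GDPR review alongside the AI Act classification.
# Fields and example entries are illustrative assumptions.

systems = [
    {"name": "CV screening assistant", "processes_personal_data": True,
     "automated_decisions_about_individuals": True, "in_ropa": False},
    {"name": "Spam filter", "processes_personal_data": True,
     "automated_decisions_about_individuals": False, "in_ropa": True},
]

for s in systems:
    if s["automated_decisions_about_individuals"]:
        print(f"{s['name']}: assess GDPR Article 22 human-review requirement")
    if s["processes_personal_data"] and not s["in_ropa"]:
        print(f"{s['name']}: add to Records of Processing Activities (Article 30)")
```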

Step 7: Governance Record

Document the following and review quarterly:

  1. AI system inventory (current as of date)
  2. Risk classification for each system
  3. High-risk system compliance actions (oversight, logging, training)
  4. Limited-risk disclosure status
  5. Vendor compliance documentation received
  6. Named governance owner for AI compliance
  7. Next review date

This record serves three purposes: internal accountability, investor due diligence, and regulatory inquiry response. An SME that can produce this record in response to a regulatory question is in a fundamentally better position than one that cannot.
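If Steps 1 through 6 are kept as structured data, the quarterly record can be assembled from them rather than rewritten each time. The sketch below is a hypothetical example of producing and storing that summary; the structure and values are illustrative.

```python
# Minimal sketch: assemble the quarterly governance record from the earlier registers.
# Structure, field names, and values are illustrative assumptions.
from datetime import date
import json

governance_record = {
    "as_of": date.today().isoformat(),
    "ai_system_inventory": ["CV screening assistant", "Support chatbot"],
    "risk_classification": {"CV screening assistant": "high", "Support chatbot": "limited"},
    "high_risk_actions": {"CV screening assistant": ["oversight documented", "training recorded"]},
    "limited_risk_disclosure": {"Support chatbot": "disclosure shown at session start"},
    "vendor_documentation_received": {"VendorA": True, "VendorB": False},
    "governance_owner": "Operations manager",
    "next_review": "2026-09-30",
}

# Store as a dated file so each quarterly review leaves an audit trail.
with open(f"governance_record_{governance_record['as_of']}.json", "w") as f:
    json.dump(governance_record, f, indent=2)
```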


The 90-Day Action Plan

Days 1-30: Complete the AI system inventory. Assign a governance owner.

Days 31-60: Classify each system. Identify any high-risk systems and initiate Step 3 actions. Request vendor compliance documentation for all non-minimal risk systems.

Days 61-90: Complete high-risk system documentation. Update vendor contracts where needed. Implement limited-risk disclosure where absent. Establish the governance record.

Ongoing: Quarterly review of the inventory for new systems. Annual review of vendor compliance documentation.


What This Does Not Replace

This checklist is an operational starting point, not legal advice. For high-risk AI systems in regulated sectors — financial services, healthcare, employment — engagement with legal counsel familiar with the Act is appropriate. The checklist gives you the operational foundation to have that conversation productively.

Talk to us about EU AI Act compliance for your Dutch SME →

Start with an AI readiness assessment →

Frequently Asked Questions

When did the EU AI Act start applying to Dutch companies?

The EU AI Act applies in phases. Provisions on prohibited AI practices applied from February 2025. Obligations for general-purpose AI models, together with the governance and penalty provisions, followed in August 2025, and high-risk system obligations phase in from August 2026. Dutch companies should treat the Act as applicable now and prepare for the remaining deadlines rather than waiting for them.

What are the main EU AI Act obligations for a Dutch SME that uses AI tools?

As a deployer, your primary obligations are: maintaining an inventory of AI systems you use, classifying them by risk category, implementing oversight and logging for any high-risk systems, ensuring transparency for customer-facing AI, and maintaining vendor compliance documentation.

Does a Dutch SME face fines under the EU AI Act?

Yes. The Act sets fines of up to €35 million or 7 percent of global annual turnover for prohibited-practice violations, and up to €15 million or 3 percent for most other infringements, with lower caps for SMEs. For deployers, the immediate risk is less about fines and more about regulatory inquiry response: being unable to demonstrate compliance when asked.

Do AI features inside SaaS platforms count under the EU AI Act?

Yes. AI features embedded in SaaS platforms you use are AI systems for the purposes of the Act. You are the deployer. The vendor is the provider. Both have obligations, but your compliance position as a deployer includes understanding what the tool does and whether it falls into a regulated risk category.

What is the difference between the EU AI Act and GDPR for AI compliance?

GDPR governs personal data processing — applicable to AI systems that process personal data, regardless of risk classification. The EU AI Act governs AI systems by risk category — applicable to all AI systems, not only those processing personal data. Both apply simultaneously; compliance with one does not substitute for compliance with the other.
