12 AI Compliance Checks Every European SME Should Run Each Quarter
TL;DR: A 12-point quarterly AI compliance monitoring checklist for European SMEs. Covers GDPR, EU AI Act deployer obligations, vendor review, and staff oversight…
Compliance monitoring is the step that turns a governance framework from a document into a live system. Without it, your AI register gets stale, vendor changes go unnoticed, and the gap between your policy and how AI is actually being used in your business quietly widens.
This checklist is designed for operations leaders, CTOs, and COOs who need to run a quarterly AI compliance review without a dedicated compliance function. Each check takes five to fifteen minutes to complete. The full set takes two to three hours when distributed across tool owners, or three to four hours if one person runs every check.
Run this checklist on the first Monday of each quarter. The output is a completed checklist with any failed checks logged as action items for the following 30 days.
Before You Start
You will need access to: your AI register, Data Processing Agreements for all active AI tools, your AI use policy, any incident logs from the past quarter, and the most recent monthly governance review records.
If you do not have an AI register yet, the AI governance framework for European SMEs gives you the structure to build one before running this checklist.
Section A: Register and Inventory Integrity (Checks 1–3)
Check 1: AI register is complete and current
Open your AI register. For every active entry: confirm the tool is still in use, the owner is a current employee, and the last review date is within the past 30 days (reflecting monthly reviews). Confirm there are no known tools in use that are not in the register.
Pass criteria: Every active tool has an owner, a risk classification, and a review date within the past 30 days. No unregistered tools are in active use.
Action if failed: Add missing tools, reassign ownership gaps, and schedule overdue reviews before closing the quarter.
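If your register lives in a spreadsheet or database export, the pass criteria above can be checked automatically. A minimal sketch, assuming the register is exported as a list of records; the field names (`tool`, `owner`, `risk`, `last_review`) and the example entries are hypothetical, not a standard schema:

```python
from datetime import date

# Hypothetical register rows for illustration only.
REGISTER = [
    {"tool": "DocSummarizer", "owner": "a.kovacs", "risk": "limited",
     "last_review": date(2026, 1, 5)},
    {"tool": "ChatWidget", "owner": "", "risk": "limited",
     "last_review": date(2025, 10, 1)},
]

def check_register(rows, today, max_age_days=30):
    """Return entries that fail Check 1: missing owner, missing
    risk classification, or a review date older than 30 days."""
    failures = []
    for row in rows:
        problems = []
        if not row["owner"]:
            problems.append("no owner")
        if not row["risk"]:
            problems.append("no risk classification")
        if (today - row["last_review"]).days > max_age_days:
            problems.append("review overdue")
        if problems:
            failures.append((row["tool"], problems))
    return failures

for tool, problems in check_register(REGISTER, today=date(2026, 1, 20)):
    print(f"{tool}: {', '.join(problems)}")
```

Running a script like this ahead of the quarterly session turns Check 1 into a confirmation exercise rather than an investigation.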
Check 2: Shadow AI assessment
Ask each team lead: are any team members in your area using AI tools not on the approved list? Cross-reference with IT system logs if available, and with the monthly review records from the past quarter.
Pass criteria: No unapproved AI tools in active use, or any discovered tools are in the escalation process.
Action if failed: Apply the shadow AI escalation framework to each unapproved tool. Determine: approve, replace, or prohibit.
Check 3: Use case drift review
For each tool in the register, confirm the current use case matches the approved use case. Tools expand in scope over time — a document summarisation tool starts being used for customer communication drafts; a coding assistant starts being used to process business logic involving personal data.
Pass criteria: Every tool's current use case matches its approved purpose in the register.
Action if failed: If use case has expanded, re-run risk classification and DPA review for the new use case before allowing continued use.
Section B: Data Compliance Checks (Checks 4–6)
Check 4: Data Processing Agreements are current
For every active AI tool that processes personal data, confirm: the DPA is signed, still in force, and covers the current use case. Vendor mergers, acquisitions, and pricing changes sometimes trigger updated terms that require DPA renegotiation.
Pass criteria: Every tool processing personal data has a current, signed DPA.
Action if failed: Contact vendor immediately. Suspend personal data processing through the tool until DPA is confirmed in place.
Check 5: Data residency is confirmed
For each tool processing personal data, confirm data is being processed in an EU/EEA region, or that adequate transfer mechanisms (SCCs, EU-US Data Privacy Framework) are in place for cross-border processing.
Pass criteria: Data residency is confirmed and documented for every tool processing personal data.
Action if failed: Obtain written confirmation from vendor of data processing location. If data is being processed outside the EU/EEA without adequate mechanisms, suspend the use case and consult legal counsel.
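The residency rule reduces to an allowlist test: either the processing region is EU/EEA, or a recognised transfer mechanism is documented. A sketch with hypothetical tool records; the region codes and mechanism labels here are illustrative shorthand, not legal categories:

```python
EU_EEA = {"EU", "EEA"}
# Shorthand for Standard Contractual Clauses and the EU-US Data Privacy Framework.
ADEQUATE_MECHANISMS = {"SCC", "DPF"}

# Hypothetical records of where each tool processes personal data.
TOOLS = [
    {"tool": "ChatWidget", "region": "US", "transfer_mechanism": "DPF"},
    {"tool": "CRMScorer", "region": "US", "transfer_mechanism": None},
]

def residency_failures(tools):
    """Tools processing outside the EU/EEA with no documented transfer mechanism."""
    return [t["tool"] for t in tools
            if t["region"] not in EU_EEA
            and t["transfer_mechanism"] not in ADEQUATE_MECHANISMS]
```

Any tool this flags is a candidate for suspension under the action above until the vendor confirms its processing location in writing.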
Check 6: No personal data in unapproved contexts
Review the past quarter's incident log and any staff-reported cases for instances where personal data was processed through a tool without a current DPA, or in a use case not covered by the DPA.
Pass criteria: No confirmed cases of personal data processed outside approved tools and use cases. Any near-misses are logged and have corrective actions in place.
Action if failed: Assess whether the incident constitutes a GDPR breach. Personal data breaches must be reported to the relevant supervisory authority within 72 hours of the organisation becoming aware. Do not delay this assessment.
Section C: EU AI Act Compliance Checks (Checks 7–9)
Check 7: High-risk system inventory and documentation
Identify every AI tool in your register classified as high-risk under Annex III of the EU AI Act. For each, confirm: a fundamental rights impact assessment has been completed, a human oversight mechanism is in place and documented, and logs of system use are being maintained.
Pass criteria: Every high-risk system has complete documentation. If no high-risk systems are in the register, this check passes.
Action if failed: Complete the missing documentation before the next quarterly review. Consult legal counsel if you are uncertain whether a system meets the high-risk threshold.
Check 8: Transparency obligations are being met
Review any customer-facing AI deployments — chatbots, automated response systems, AI-generated content sent externally. Confirm that disclosure of AI origin is in place where required under EU AI Act Article 50.
Pass criteria: Every customer-facing AI deployment includes appropriate disclosure. No AI-generated content is presented as human-generated in contexts where that distinction is material.
Action if failed: Update the relevant interface, template, or workflow to include disclosure. Log the correction date.
Check 9: Risk classifications are still accurate
For every tool in the register, confirm the current EU AI Act risk classification reflects the current use case. Use case changes can shift a tool from minimal to limited or high risk. Regulatory guidance updates can also change classifications.
Pass criteria: Every tool's classification reflects its current use case and the current regulatory guidance.
Action if failed: Re-classify affected tools. For tools moving into high-risk, complete the required documentation before allowing continued use.
Section D: Vendor and Contract Health (Checks 10–12)
Check 10: Vendor change review
Review any notices received from AI tool vendors in the past quarter — pricing changes, terms updates, product changes, ownership changes. Confirm that material changes have been assessed for compliance impact and that any required contract renegotiation is in progress.
Pass criteria: No unreviewed vendor change notices. Any material changes have been assessed and actioned.
Action if failed: Prioritise review of outstanding notices. If a vendor has been acquired, confirm new data processing terms before the next monthly review.
Check 11: Contract renewal and exit terms review
For each active AI tool contract expiring within the next six months, review: data export rights, exit provisions, and whether the tool's performance still justifies renewal. Confirm you can export your data in a usable format if you choose not to renew.
Pass criteria: Every contract expiring within the next six months has been reviewed. Data export capability is confirmed for tools approaching end of contract.
Action if failed: Contact vendors for contracts where data export terms are unclear. Negotiate export rights before renewal if they are not in the current contract.
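The six-month window makes this check easy to pre-compute. A sketch that surfaces contracts nearing expiry where export rights are not yet confirmed; the records and the `export_confirmed` flag are hypothetical:

```python
from datetime import date

# Hypothetical contract records; export_confirmed means usable-format
# data export rights have been verified in the current contract.
CONTRACTS = [
    {"tool": "ChatWidget", "expires": date(2026, 5, 1), "export_confirmed": False},
    {"tool": "DocSummarizer", "expires": date(2027, 2, 1), "export_confirmed": True},
]

def renewals_at_risk(contracts, today, window_days=183):
    """Contracts expiring within roughly six months that still
    lack confirmed data export terms."""
    return [c["tool"] for c in contracts
            if (c["expires"] - today).days <= window_days
            and not c["export_confirmed"]]
```

Anything this returns goes straight into the quarter's action items: negotiate export rights before the renewal conversation, not during it.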
Check 12: Staff training currency
Confirm all staff with access to AI tools have completed the required AI use policy training within the past 12 months. Check for new hires who may not have completed onboarding training, and for staff who have changed roles and taken on new AI tool access.
Pass criteria: Every staff member with AI tool access has completed current training. No more than 10% of users are overdue for renewal.
Action if failed: Schedule make-up training for overdue staff. Make it a condition of continued tool access. Update onboarding to prevent gaps for new hires.
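The 12-month currency rule and the 10% overdue threshold are both mechanical, so this check can also be scripted. A sketch against a hypothetical list of users with AI tool access; `trained` is the date of their last policy training, `None` for staff who have never completed it:

```python
from datetime import date

# Hypothetical training records for staff with AI tool access.
USERS = [
    {"name": "a.kovacs", "trained": date(2025, 3, 1)},
    {"name": "b.lind", "trained": date(2024, 11, 1)},
    {"name": "c.mora", "trained": None},  # new hire, not yet trained
]

def training_check(users, today, max_age_days=365, threshold=0.10):
    """Check 12: returns (passed, overdue_names). Passes when at most
    10% of users are overdue for annual policy training."""
    overdue = [u["name"] for u in users
               if u["trained"] is None
               or (today - u["trained"]).days > max_age_days]
    passed = len(overdue) / len(users) <= threshold
    return passed, overdue
```

Wiring the same check into onboarding (run it when access is granted, not just quarterly) prevents the new-hire gap the action above describes.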
After the Checklist
Complete a quarterly compliance record summarising: checks completed, pass/fail status for each, action items with owners and deadlines, and any escalations to the leadership team.
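If you keep the quarterly record in a machine-readable form, generating it from the check results is straightforward. A minimal sketch; the record structure, quarter label, and owner placeholder are all assumptions, not a required format:

```python
import json
from datetime import date

# Hypothetical results keyed by check number (1-12).
RESULTS = {n: "pass" for n in range(1, 13)}
RESULTS[3] = "fail"  # e.g. use case drift found on one tool

record = {
    "quarter": "2026-Q1",
    "completed": date(2026, 1, 5).isoformat(),
    "checks": RESULTS,
    # Every failed check becomes an action item with a 30-day deadline.
    "action_items": [
        {"check": n, "owner": "tbd", "due_in_days": 30}
        for n, status in RESULTS.items() if status == "fail"
    ],
}
print(json.dumps(record, indent=2))
```

A dated file per quarter, stored alongside the register, is exactly the audit evidence trail described below.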
Escalate immediately — do not wait for the next quarterly cycle — if any of the following are found: confirmed personal data processed without a DPA, an unreviewed vendor acquisition notice with GDPR implications, or a high-risk AI system operating without required documentation.
The quarterly record is your primary audit evidence. Under the EU AI Act, the burden is on deployers to demonstrate compliance — the checklist record is how you meet that burden without a dedicated compliance function.
For the monthly review cadence that sits between quarterly checks, see the monthly AI governance review template.
Frequently Asked Questions
How long does the full quarterly checklist take?
Approximately two to three hours if distributed across tool owners. If a single person runs all 12 checks, allow three to four hours. Reduce time by ensuring monthly review records are up to date — a well-maintained monthly record makes Checks 1, 3, 6, and 12 faster, since you are confirming quarterly patterns rather than investigating from scratch.
What if we fail multiple checks in the same quarter?
Prioritise by risk: data compliance failures (Checks 4–6) and EU AI Act documentation failures for high-risk systems (Check 7) take priority over administrative gaps. Address data compliance and high-risk system issues within 30 days. Administrative gaps (ownership, training currency, register completeness) should be resolved before the next quarterly review.
Do we need to share this checklist with anyone outside the organisation?
Not routinely. The completed checklist is internal governance documentation. You would share it with a national supervisory authority if requested during an investigation, with enterprise customers who conduct vendor AI governance audits, or with cyber insurance providers who include AI governance in their policy conditions. Keep completed checklists for a minimum of three years.
Can this checklist substitute for legal advice on EU AI Act compliance?
No. The checklist gives you an operational monitoring framework — it identifies gaps and creates an evidence trail. It does not substitute for legal advice on whether specific tools require conformity assessments, whether your fundamental rights impact assessments are sufficient, or how to handle a confirmed breach. Use this checklist to identify issues; use your legal counsel to resolve them.
Further Reading
- AI Governance Framework for European SMEs 2026 — the full governance structure this checklist monitors
- Monthly AI Governance Review Template for SMEs — the monthly cadence between quarterly audits
- AI Tool Selection Scorecard for European SMEs — use before adding new tools to reduce compliance gaps at source
- Shadow AI Escalation Framework for European SMEs — decision path when Check 2 surfaces unapproved tools
Want support running your first quarterly review? Start with a free AI readiness assessment to identify your current compliance gaps before the audit.