Does the EU AI Act Apply to Your Company? Find Out in 3 Minutes.
Your boss just asked what the AI Act means for your company. You googled it. You found a 180-page regulation. Now you need to say something useful in the next meeting.
Short version: if your organisation uses AI tools in the EU, the AI Act almost certainly applies to you. That includes companies using ChatGPT, third-party hiring tools, or any AI-powered SaaS. Most companies are behind. The quiz tells you by how much.
This assessment covers the 10 areas that matter most. It takes 3 minutes. You get a score out of 30, a breakdown of every gap, and specific next steps.
Built for the person who got handed "AI Act" as an extra responsibility
You're probably a DPO, IT manager, compliance coordinator, or operations lead at a company with 50 to 5,000 employees. You understand your business but you're not a regulatory lawyer. You need to walk into a meeting on Thursday and say something more useful than "we should probably look into this."
This quiz gives you that. A score you can show your CEO and a gap list that tells you where to start. If you work with clients on compliance, you can share the quiz as a starting point for conversations. Several consultants already do.
Does the EU AI Act Apply to You?
Take This 3-Minute Check.
The deadline is approaching. Most companies haven't started. This assessment tells you where you stand and what to do first.
10 questions. 3 minutes. Free.
Score by area
Your top gaps
Your score won't improve on its own
Every Tuesday we break down one EU regulation in plain language. What changed, who it affects, what to do. Subscribers also get the AI Act Compliance Checklist.
Free. Unsubscribe anytime. Privacy policy.
Does the AI Act apply if we just use third-party AI tools?
Yes. This is the most common misconception. The AI Act creates separate obligations for "deployers" — organisations that use AI systems, even if they didn't build them. If your HR team uses an AI recruitment tool, your customer service runs an AI chatbot, or your finance team uses AI-powered analytics, your company carries deployer obligations. These include transparency (telling people they're interacting with AI), human oversight (someone must be able to review and override AI decisions), and monitoring (watching how the system performs and reporting serious incidents). Your contract with the AI vendor doesn't transfer these obligations. They're yours.
What are the AI Act deadlines?
The AI Act entered into force on 1 August 2024. Obligations roll out in phases. Prohibited AI practices have been banned since 2 February 2025. AI literacy requirements have also been in force since that date — most companies don't realise this. General-purpose AI model obligations applied from 2 August 2025. The big deadline for high-risk AI systems was originally 2 August 2026. In March 2026, the European Parliament voted, with 569 in favour, to delay this to 2 December 2027 through the Digital Omnibus. The delay is moving through trilogue and is expected to be formally adopted by mid-2026. Until then, 2 August 2026 technically remains the law.
What counts as high-risk AI under the EU AI Act?
The AI Act classifies systems by how they're used, not by what technology they run on. The same AI model can be minimal risk for content summarisation and high-risk the moment it's used for hiring decisions. Annex III lists eight high-risk domains: biometrics, critical infrastructure, education, employment, essential services (credit scoring, insurance), law enforcement, migration and border control, and justice. For most mid-market companies, the first encounter with high-risk AI is in HR. CV screening tools, interview scoring systems, and performance monitoring software all qualify. If your company uses any of these, you have high-risk obligations regardless of whether you built the AI yourself.
What are the penalties?
The AI Act has three penalty tiers. Using a prohibited AI system carries fines up to €35 million or 7% of global annual turnover, whichever is higher. Violating high-risk system or transparency obligations carries fines up to €15 million or 3% of global turnover. Providing incorrect information to authorities carries fines up to €7.5 million or 1% of global turnover. For SMEs and startups, fines are capped at the lower of the fixed amount or the percentage. Beyond fines, authorities can order companies to stop using non-compliant AI systems entirely. For a company whose core operations depend on an AI tool, that order is more damaging than any fine.
What should we do first?
First, make a list of every AI tool your company uses. Include third-party SaaS tools with AI features, not just systems you built. This is your AI inventory. Second, check each tool against the AI Act risk categories: does it fall into one of the eight Annex III high-risk domains? Most companies find their exposure is in hiring tools and customer-facing chatbots. Third, assign one person to own AI Act compliance. Without a named owner, nothing moves. All three steps together take 2-4 weeks. No consultants needed.
I work with clients on compliance. Can I share this assessment?
Yes. Several compliance consultants and DPO-as-a-service providers use this quiz as a conversation starter with clients. Send the link, the client takes 3 minutes, and you discuss their score in your next meeting. It replaces the awkward "so, how prepared do you think you are?" conversation with concrete data. If you want to discuss co-branded versions or bulk data for your practice, contact us at info@regdossier.eu.
RegDossier
Making EU compliance almost enjoyable. Almost.
Every Tuesday we break down one EU regulation that affects your business. Free.
Get the Tuesday briefing