EU AI Act compliance for Luxembourg SMEs: what to do before 2 August 2026
By LetzAgents, Luxembourg AI compliance team. Published on 18 April 2026. Updated on 18 April 2026.
In Brief
- Deadline: on 2 August 2026, the EU AI Act becomes fully applicable to high-risk systems under Annex III, and national penalties come into force (source: Regulation EU 2024/1689).
- Already in force: Article 4 (AI literacy) has applied to every company using an AI system since 2 February 2025, with no size threshold (source: artificialintelligenceact.eu).
- Luxembourg authority: the CNPD, through bill no. 8476, operates the national regulatory sandbox (source: cnpd.public.lu, chd.lu).
- Funding: the SME Package AI covers 70% of eligible costs for a project of €3,000 to €25,000 (excl. VAT), up to €17,500 in subsidy (source: guichet.public.lu).
Introduction
If you run a Luxembourg SME, the date of 2 August 2026 comes up everywhere the EU AI Act is discussed. Regulation (EU) 2024/1689 has been applying in stages since February 2025: some obligations are already binding, others reach full applicability in just over one hundred days. This guide spells out what a Luxembourg SME must document, train for and track before the summer, with local anchors: the CNPD, bill no. 8476, and the SME Package AI via guichet.lu.
1. What changes on 2 August 2026
Three things shift on that date for every company active in Luxembourg.
Obligations for high-risk AI systems under Annex III become fully enforceable: automated recruitment, credit scoring, certain HR tools, management of essential services (source: Regulation EU 2024/1689).
Article 85 opens the right to file a complaint with the national market surveillance authority (source: artificialintelligenceact.eu). That single window will be the CNPD.
National penalties come into effect: up to €35 million or 7% of global turnover for prohibited practices, €15 million or 3% for non-compliant high-risk systems, €7.5 million or 1% for transparency failures (source: Regulation EU 2024/1689, articles 99 onwards).
For most non-financial SMEs, everyday uses (chatbot, internal assistant, assisted writing) do not fall under high risk. What concerns you: transparency, training, documentation.
💡 Worth knowing: Luxembourg concentrates AI Act responsibility on the CNPD through bill no. 8476, the default national market surveillance authority (source: cnpd.public.lu, chd.lu). A single public point of contact for your AI compliance questions.
2. AI literacy (Article 4), already in force
Article 4 of the EU AI Act, dedicated to AI literacy, has been in force since 2 February 2025 (source: artificialintelligenceact.eu/article/4). It targets providers and deployers of AI systems, with no size threshold: any organisation that uses an AI system falls in scope.
The text requires your staff and the third parties operating these systems to have a sufficient level of AI competence, calibrated to their prior training, the context of use and the persons affected (source: official AI Act text).
The nuance that matters: the national penalties tied to Article 4 only activate on 2 August 2026, but the obligation itself is already live. A complaint filed in August 2026 about a use that took place in 2025 without adequate training can invoke that pre-existing non-compliance. As covered in our analysis of the risks of consumer ChatGPT in the workplace, start by mapping what your teams use day to day.
3. The four risk categories for an SME
The EU AI Act sorts systems into four categories. Here is the triage for a typical Luxembourg SME.
| Category | Luxembourg SME examples | Present in SMEs? | Recommended action |
|---|---|---|---|
| Unacceptable (prohibited) | Social scoring, behavioural manipulation, emotion recognition in a professional context | No, very rare uses | Check Article 5 if you operate a surveillance system |
| High risk (Annex III) | Automated CV screening, credit scoring, biometrics, critical infrastructure | Mostly regulated sectors and large-scale HR | Full audit, documentation, human oversight, registration in the EU database |
| Limited risk (transparency) | Customer chatbot, internal AI assistant, generated marketing content, synthetic voice | Yes, the majority case | Inform users they are interacting with an AI; label generated content |
| Minimal | Spam filters, standard product recommendations, spell-checkers | Yes, everywhere | No new obligations; internal best practices recommended |
For a Luxembourg fiduciary, law firm or real-estate agency running common AI tools, the core of the effort sits inside "limited risk". For sector-specific vigilance, see our dedicated pages for fiduciaries and for lawyers and notaries.
4. 100-day compliance checklist
Six actions to kick off before 2 August 2026; the effort is more a matter of discipline than of budget.
- Inventory the AI systems in use. Public chatbots, tools embedded in office suites, browser plugins, CRM or accounting extensions. Include informal uses.
- Map vendors and data location. For each system: vendor, servers, jurisdiction, handling of submitted data.
- Build an AI literacy training plan. Target audience, format, frequency, traceability. Framing your enterprise AI strategy upstream makes the articulation easier.
- Put transparency notices in place. Chatbot announced as AI, generated content labelled.
- Draft an internal AI usage policy. Approved tools, forbidden data (professional secrecy, client data, health data), contact point in case of doubt.
- Keep an AI incident log. Anomalies (hallucination, suspected leak, biased decision) logged with date, system, action taken.
Order of execution: items 1 and 2 in April and May, items 3 to 5 in June, item 6 on an ongoing basis.
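Item 6 does not require any tooling: a plain CSV file is enough to start. The sketch below is a minimal example of such a log; the field names are our own suggestion, not prescribed by the AI Act.

```python
import csv
from datetime import date
from pathlib import Path

# Minimal AI incident log: one CSV row per anomaly (hallucination,
# suspected leak, biased decision). Field names are illustrative only.
FIELDS = ["date", "system", "incident_type", "description", "action_taken"]
LOG_FILE = Path("ai_incident_log.csv")

def log_incident(system: str, incident_type: str,
                 description: str, action_taken: str) -> None:
    """Append one incident to the log, writing a header row on first use."""
    new_file = not LOG_FILE.exists()
    with LOG_FILE.open("a", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow({
            "date": date.today().isoformat(),
            "system": system,
            "incident_type": incident_type,
            "description": description,
            "action_taken": action_taken,
        })

log_incident("customer chatbot", "hallucination",
             "Quoted a non-existent legal deadline",
             "Answer corrected; prompt reviewed")
```

Whatever the format, the point is the same: date, system, nature of the incident and action taken, captured as you go rather than reconstructed under pressure.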
5. Luxembourg resources to tap into
The CNPD (cnpd.public.lu) is your reference authority. In January 2026, during the "AI Act in Action" conference at the Luxembourg Chamber of Commerce in front of more than 300 participants, it announced the ramp-up of several tools (source: cnpd.public.lu, press release of 20 January 2026).
The CNPD regulatory sandbox lets you test an innovative AI use case under supervision, with provisional legal framing. Useful if you hesitate between limited risk and high risk. The RE.M.I. (Regulation Meets Innovation) initiative is the dedicated channel for dialogue between innovators and the regulator (source: cnpd.public.lu, RE.M.I. dossier). The "Data Protection Basics: AI" training sessions are a valid building block of your AI literacy programme for staff operating AI systems that process personal data.
6. Funding compliance: the SME Package AI
The Luxembourg State rolled out a dedicated AI package for SMEs in March 2025. The SME Package AI reimburses 70% of eligible costs of an AI integration project, for an amount of €3,000 to €25,000 (excl. VAT), meaning up to €17,500 in subsidy (source: guichet.public.lu, SME Package AI factsheet).
Typical fundable items: compliance audit, system mapping, AI literacy training, deployment of an AI solution compliant by design. The company runs the project, pays the provider, then gets reimbursed by the Ministry of the Economy via guichet.lu.
The SME Package AI coexists with the SME Package Digital: on distinct projects, stacking is allowed. The House of Entrepreneurship and the Chambre des Métiers support the application process. For the economic trade-offs, see our guide on the cost of private AI for SMEs in Luxembourg.
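The arithmetic of the scheme is simple; the sketch below uses the thresholds quoted above from the guichet.public.lu factsheet.

```python
# SME Package AI subsidy: 70% of eligible project costs, for projects
# of €3,000 to €25,000 excl. VAT (figures from the guichet.public.lu factsheet).
RATE = 0.70
MIN_PROJECT, MAX_PROJECT = 3_000, 25_000

def subsidy(eligible_costs: float) -> float:
    """Return the reimbursement for a project within the eligible range."""
    if not MIN_PROJECT <= eligible_costs <= MAX_PROJECT:
        raise ValueError("project outside the eligible range")
    return RATE * eligible_costs

print(subsidy(25_000))  # maximum subsidy: 17500.0
print(subsidy(10_000))  # 7000.0
```

A €25,000 project thus yields the €17,500 ceiling; anything above €25,000 remains at your charge.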
7. US cloud LLMs and sovereignty
A technical point that makes the difference: the data processing chain when you use a large language model hosted in the United States. LLMs operated from the US by players such as OpenAI, Anthropic, Google or Microsoft fall under the US CLOUD Act (2018), which allows US authorities to compel a US provider to disclose data it holds, regardless of where that data is stored. These players offer solid services; the question is how that legal exposure lines up with the GDPR and the EU AI Act.
For a Luxembourg SME in a regulated sector (fiduciary, law firm, medical practice, insurance broker, parapublic entity), that overlay makes the compliance demonstration heavier. A sovereign private LLM, hosted in Europe under European control, simplifies those demonstrations by design: data within the European legal framework, direct auditability, traceability inside a single log. See our resource on protecting data through a private AI.
8. Three decisions to make this week
Appoint an internal AI lead. One person, not necessarily the CIO, accountable for the compliance file and for training coordination. Without an owner, the topic stays theoretical.
Launch the AI systems inventory. Exhaustive list, built in consultation with each department.
Check your eligibility for the SME Package AI. A pre-diagnostic confirms whether your project (audit, training, possible private AI deployment) sits within the €3,000 to €25,000 (excl. VAT) range subsidised at 70%. This is why LetzAgents supports this effort with local anchoring and sovereign AI infrastructure.
FAQ: Your questions about the EU AI Act in Luxembourg
1. Who is the AI Act authority in Luxembourg?
The CNPD is designated as the default national market surveillance authority for the AI Act, through bill no. 8476. It combines the roles of single point of contact, overseer of high-risk systems under Annex III, and operator of the regulatory sandbox, making it the single counterpart for any AI compliance question in Luxembourg (source: cnpd.public.lu, chd.lu).
2. When does the AI Act apply to my SME?
Article 4 on AI literacy has applied since 2 February 2025 to every company using an AI system, with no size threshold. Obligations for high-risk systems under Annex III and national penalties become fully applicable on 2 August 2026 (source: Regulation EU 2024/1689).
3. What are the obligations for a non-financial SME?
Three axes for most SMEs outside finance and large-scale automated HR: AI literacy training for staff (Article 4, already in force), transparency towards users of limited-risk systems, documentation of internal AI uses. High risk stays marginal without automated decision-making on sensitive files (source: artificialintelligenceact.eu).
4. Does the SME Package AI fund compliance?
Yes. It reimburses 70% of eligible costs of an AI project for a Luxembourg SME, between €3,000 and €25,000 (excl. VAT), up to €17,500 in subsidy. Fundable items: audit, mapping, AI literacy training, deployment of a compliant AI solution. Application via guichet.lu (source: guichet.public.lu).
5. What are the penalties for non-compliance?
Effective on 2 August 2026 in three tiers: up to €35 million or 7% of global turnover for prohibited practices, €15 million or 3% for non-compliant high-risk systems, €7.5 million or 1% for transparency failures (source: Regulation EU 2024/1689, articles 99 onwards). For an SME, the practical priority remains documentation and training.
Preparing for the EU AI Act without overreaction
The EU AI Act in Luxembourg does not force a radical overhaul on most SMEs. It demands discipline: inventory, document, train, track. A hundred days are enough to reach an operational compliance level if the topic is owned by an internal lead and backed by the SME Package AI. Do not wait until 2 August to kick off internal training: Article 4 is already in force.
About the author
LetzAgents deploys a sovereign private AI for Luxembourg SMEs and regulated organisations, hosted in Europe, compliant with the GDPR and the EU AI Act. Our team has been tracking the national implementation of the AI regulation since its publication in June 2024.
This article is based on Regulation EU 2024/1689, official CNPD communications (cnpd.public.lu), bill no. 8476 (chd.lu), guichet.public.lu factsheets on the SME Packages, and the official portal artificialintelligenceact.eu.
Keywords
eu ai act sme luxembourg, ai act compliance business, ai act deadline 2 august 2026, article 4 ai literacy obligation, cnpd ai act authority luxembourg, mandatory ai training company, ai act high-risk systems sme, sme package ai luxembourg compliance, ai act bill 8476, cnpd regulatory sandbox