HDS (Hébergeur de Données de Santé) certification is a mandatory French legal requirement under Article L.1111-8 of the French Public Health Code for any organization — including AI providers — that hosts, processes, or manages personal health data on behalf of French healthcare entities.
If you're an AI provider operating in the French healthcare market, or building toward it, you've likely encountered the acronym HDS and wondered whether it applies to you. Short answer: if your AI solution touches the personal health data of French patients, it almost certainly does. And the consequences of getting this wrong aren't just regulatory fines — they can include criminal liability under French law.
I'm Jared Clark, and over my 8+ years advising regulated organizations on AI governance, I've watched the intersection of HDS and AI systems become one of the most underestimated compliance blind spots in digital health. This guide cuts through the noise to give you the clearest, most actionable roadmap available for AI providers pursuing HDS certification.
What Is HDS Certification — and Why Does It Exist?
HDS stands for Hébergeur de Données de Santé (Health Data Host). The certification framework was established by the French government and is governed by the Agence du Numérique en Santé (ANS). It exists for a straightforward reason: France recognized that healthcare data, when entrusted to third-party hosts and technology providers, requires a higher standard of security and accountability than general data protection law provides on its own.
The framework is built on two well-known international standards:
- ISO/IEC 27001:2022 — Information security management
- ISO/IEC 27701:2019 — Privacy information management
HDS certification adds a healthcare-specific overlay to these standards, introducing six activity categories (called activités) that define the scope of certification depending on what a provider actually does with health data.
The Six HDS Activity Categories
| Activity | Description | Typical AI Provider Relevance |
|---|---|---|
| 1 | Physical infrastructure provision (data centers, racks) | Low — unless self-hosting |
| 2 | Virtual/cloud infrastructure provision | Medium — relevant for IaaS AI platforms |
| 3 | Managed infrastructure services | Medium — relevant for PaaS AI tools |
| 4 | Managed platform/backup services | High — common for SaaS AI applications |
| 5 | Software application hosting | Very High — core scope for most AI SaaS |
| 6 | Data archiving and destruction | Situational — depends on data lifecycle |
Most AI providers targeting the French healthcare market will need certification under Activities 4 and/or 5 at minimum. If your AI model is cloud-hosted and processes diagnostic data, clinical notes, or any patient-identifiable information, Activity 5 is almost certainly in scope.
Why HDS Certification Is Now a Strategic Imperative for AI Providers
It would be a mistake to treat HDS purely as a checkbox compliance exercise. The strategic picture has shifted considerably in the past three years, and AI providers who understand this will gain meaningful competitive advantage.
The Market Access Argument
French public hospitals (établissements de santé) are legally prohibited from contracting with non-HDS-certified providers for health data hosting. This isn't a preference — it's a hard legal gate. The French health system encompasses over 3,000 hospitals and clinics, representing a multi-billion euro procurement market. Without HDS certification, that entire market is simply closed to you.
According to ANS data, the number of HDS-certified organizations grew by approximately 40% between 2021 and 2024, reflecting rapid market adoption. AI providers who delay certification are watching competitors build trust relationships with French healthcare buyers that become increasingly difficult to displace.
The EU AI Act Amplification Effect
The EU AI Act, which entered into force in August 2024, classifies most AI systems used in healthcare — including diagnostic support tools, clinical decision aids, and patient risk stratification models — as high-risk AI systems under Annex III. This classification triggers conformity assessment requirements, mandatory risk management systems, and transparency obligations.
Here's the critical intersection: HDS certification and EU AI Act compliance are not the same thing, but they are deeply complementary. The data governance and security controls required for HDS directly support the data quality and traceability requirements under EU AI Act Article 10. AI providers who build their compliance architecture around HDS first are far better positioned to layer EU AI Act requirements on top.
AI providers operating under both HDS and the EU AI Act framework benefit from a unified governance posture — the information security management system (ISMS) required for HDS certification provides the documented control foundation that EU AI Act conformity assessments rely on for high-risk healthcare AI systems.
The Patient Trust Dividend
A 2023 survey by Accenture found that 79% of patients consider data privacy "very important" when deciding whether to share health information with AI-powered tools. HDS certification provides a recognized, independently audited signal of trustworthiness that marketing claims simply cannot replicate. In a market where AI skepticism remains high, third-party certification is a differentiation tool as much as a compliance requirement.
How HDS Certification Works: The Audit Process Explained
HDS certification is issued by accredited certification bodies (CBs) that have been approved by COFRAC (the French national accreditation body). The audit process follows a structured lifecycle.
Phase 1: Gap Assessment (4–8 Weeks)
Before engaging a certification body, every AI provider should conduct an honest internal gap assessment against the HDS referential. The key questions:
- Have you implemented a fully documented ISO 27001 ISMS?
- Do you have contractual protections in place with all sub-processors who touch health data?
- Are physical access controls documented for all environments where health data resides (including cloud provider agreements)?
- Do you have a formal incident response procedure specifically addressing health data breaches?
- Is your data processing register current, accurate, and scoped to HDS activities?
In my experience working with 200+ regulated clients, the gap assessment typically reveals that ISO 27001 documentation is the longest lead-time item — particularly for AI providers who have scaled quickly and built technical controls without formalizing the management system around them.
Phase 2: Remediation (2–6 Months)
Based on gap findings, remediation typically involves:
- Policy and procedure development — HDS requires specific policies around health data classification, retention, and destruction
- Sub-processor management — Every third party that touches health data (cloud providers, monitoring tools, analytics platforms) must be assessed and contracted appropriately
- Staff training — All personnel with access to health data must receive documented training
- Technical controls — Encryption at rest and in transit, access logging, MFA, and network segmentation are baseline requirements
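The "encryption in transit" item above can be enforced in code as well as in policy. As a minimal illustration (not an HDS-prescribed configuration; the referential specifies outcomes rather than implementations), a Python service can refuse anything weaker than TLS 1.2 using only the standard library:

```python
import ssl

def strict_client_context() -> ssl.SSLContext:
    """Build a TLS client context suitable for health-data traffic:
    certificate validation on, hostname checking on, TLS >= 1.2."""
    ctx = ssl.create_default_context()
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # reject TLS 1.0/1.1
    ctx.check_hostname = True
    ctx.verify_mode = ssl.CERT_REQUIRED
    return ctx
```

The same idea applies server-side: pin a minimum protocol version centrally rather than relying on each service's defaults, so the control is demonstrable to an auditor from one place.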
For AI providers specifically, a frequent remediation gap is model training data governance. If your AI model was trained on datasets containing real patient data — even anonymized — you must be able to demonstrate the provenance, consent basis, and transformation chain for that data.
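One lightweight way to make that transformation chain demonstrable is to hash-link each processing step to the one before it. The sketch below is purely illustrative: `ProvenanceChain` and its field names are hypothetical, and HDS does not prescribe any particular provenance structure.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Content fingerprint used to link successive transformation steps."""
    return hashlib.sha256(data).hexdigest()

class ProvenanceChain:
    """Records each transformation applied to a training dataset so the
    lineage from source data to final training set can be shown end to end."""

    def __init__(self, source_label: str, source_data: bytes, consent_basis: str):
        self.steps = [{
            "step": "source",
            "label": source_label,
            "consent_basis": consent_basis,   # documented legal basis
            "output_fp": fingerprint(source_data),
        }]

    def apply(self, operation: str, transform, data: bytes) -> bytes:
        """Run a transformation and record input/output fingerprints."""
        out = transform(data)
        self.steps.append({
            "step": operation,
            "input_fp": fingerprint(data),
            "output_fp": fingerprint(out),
        })
        return out
```

Because each step records the fingerprint of its input, an auditor (or a buyer's due-diligence team) can verify that the de-identified training set really descends from the documented, consented source.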
Phase 3: Certification Audit (4–8 Weeks)
The certification audit is conducted in two stages:
- Stage 1 (Documentation Review): The certification body reviews your ISMS documentation, policies, and evidence packages remotely
- Stage 2 (On-Site Audit): Auditors verify implementation through interviews, system demonstrations, and evidence sampling
Nonconformities identified during audit are classified as major (must be resolved before certification is granted) or minor (must be resolved within a defined timeframe post-certification). Preparation quality is the single biggest determinant of audit outcome.
Phase 4: Certification Issuance and Surveillance
HDS certificates are valid for 3 years, with annual surveillance audits required to maintain validity. Recertification audits occur at the 3-year mark. This ongoing audit cadence is a critical operational consideration — it means HDS is not a one-time project but a continuous compliance program.
HDS vs. HIPAA vs. ISO 27001: Understanding the Landscape
One of the most common questions I receive from AI providers expanding internationally is: "We're already HIPAA-compliant and ISO 27001 certified — do we really need HDS on top of that?"
The answer requires nuance, so here's a structured comparison:
| Dimension | HIPAA | ISO 27001:2022 | HDS |
|---|---|---|---|
| Jurisdiction | United States | Global (voluntary) | France (mandatory for covered activities) |
| Legal force | Federal law | Voluntary standard | French law (Art. L.1111-8) |
| Scope | PHI across all covered entities | Information assets (scoped by org) | Personal health data hosted for French patients |
| Audit requirement | No mandatory third-party audit | Third-party certification | Mandatory third-party certification via COFRAC-accredited CB |
| AI-specific provisions | Minimal | None specific | None specific (but EU AI Act fills this gap) |
| Certification duration | N/A | 3 years + annual surveillance | 3 years + annual surveillance |
| Criminal liability | Yes (Federal) | No | Yes (French Penal Code) |
| Sub-processor requirements | BAA required | Supplier agreements | Detailed HDS-specific contracts required |
The key takeaway: ISO 27001 certification significantly accelerates HDS certification but does not substitute for it. Organizations with a mature ISO 27001 ISMS can typically reduce their HDS readiness timeline by 30–40% compared to organizations starting from scratch.
HIPAA compliance, while valuable for U.S. market positioning, does not map cleanly to HDS requirements because HDS has specific contractual, technical, and operational requirements that go beyond HIPAA's administrative safeguards framework.
The AI-Specific Challenges in HDS Certification
Standard HDS guidance was written before large-scale AI adoption in healthcare. AI providers face several compliance challenges that require careful interpretation of the framework:
1. Model Inference as "Hosting"
When an AI provider runs inference on patient data — for example, analyzing a radiology image or processing a clinical note — this constitutes hosting health data under the HDS definition, even if the data is processed in memory and not persistently stored. Ephemeral processing of health data does not exempt you from HDS requirements. AI providers must design their data pipelines to account for this, with appropriate logging, access controls, and contractual coverage even for transient data flows.
2. Training Data Provenance
If your AI model's weights were derived from patient data — even through federated learning or differential privacy mechanisms — you must be prepared to document this in your compliance posture. While HDS does not prescribe specific training data requirements, the broader GDPR and EU AI Act framework (including Article 10 on training data governance) will be scrutinized in any serious buyer due diligence process.
3. Third-Party AI Infrastructure
Most AI providers rely on foundation model providers or cloud AI services (GPU compute, vector databases, LLM APIs). Every link in this chain that touches health data must be covered by HDS-compliant sub-processor agreements. This is frequently the most complex scoping exercise in an HDS certification project for AI companies — mapping every data flow through the AI stack and confirming that each vendor either holds its own HDS certification or is covered under your certification scope.
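In practice this scoping exercise is easier with a machine-readable inventory of data flows that can be re-checked before every surveillance audit. A minimal sketch follows; the `DataFlow` structure and vendor names are hypothetical illustrations, not an HDS artifact:

```python
from dataclasses import dataclass

@dataclass
class DataFlow:
    vendor: str                # e.g. GPU cloud, vector DB, LLM API
    touches_health_data: bool  # does patient data ever reach this vendor?
    hds_certified: bool        # vendor holds its own HDS certificate
    in_our_hds_scope: bool     # covered under our certification scope

def uncovered_flows(flows):
    """Return flows that touch health data but have no HDS coverage —
    i.e. the gaps that must be closed before the audit."""
    return [
        f for f in flows
        if f.touches_health_data and not (f.hds_certified or f.in_our_hds_scope)
    ]
```

Running `uncovered_flows` over the full vendor map turns a vague scoping question into a concrete remediation list: every returned entry needs either a certified vendor, a scope extension, or a redesigned data flow.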
4. Audit Logging at AI System Scale
HDS requires comprehensive audit logging of all access to health data. For AI systems processing high volumes of patient records — potentially millions of inference requests per day — this creates real engineering challenges. Logs must be tamper-evident, retained for defined periods, and auditable. AI providers need to design logging architectures that satisfy HDS requirements without creating prohibitive storage costs or performance bottlenecks.
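A common pattern for tamper-evidence is a hash-chained log, where each entry commits to the hash of its predecessor so any retroactive edit breaks the chain. The sketch below shows the idea with the standard library only; it is an illustrative design, not an HDS-mandated one (the referential specifies outcomes, not implementations), and a production system would add signed checkpoints and durable storage.

```python
import hashlib
import json
import time

class AuditLog:
    """Append-only, hash-chained audit log: each entry embeds the hash
    of the previous entry, so retroactive edits are detectable."""

    GENESIS = "0" * 64

    def __init__(self):
        self.entries = []
        self._last_hash = self.GENESIS

    def record(self, actor: str, action: str, resource: str) -> dict:
        entry = {
            "ts": time.time(),
            "actor": actor,
            "action": action,
            "resource": resource,
            "prev_hash": self._last_hash,
        }
        # Canonical JSON so the hash is reproducible at verification time
        payload = json.dumps(entry, sort_keys=True).encode()
        entry["hash"] = hashlib.sha256(payload).hexdigest()
        self._last_hash = entry["hash"]
        self.entries.append(entry)
        return entry

    def verify(self) -> bool:
        """Recompute every hash and check the chain links end to end."""
        prev = self.GENESIS
        for entry in self.entries:
            if entry["prev_hash"] != prev:
                return False
            body = {k: v for k, v in entry.items() if k != "hash"}
            payload = json.dumps(body, sort_keys=True).encode()
            if hashlib.sha256(payload).hexdigest() != entry["hash"]:
                return False
            prev = entry["hash"]
        return True
```

At millions of inference requests per day, hashing every entry individually may be too costly; batching entries into periodically sealed blocks (a Merkle-tree variant of the same idea) keeps tamper-evidence while bounding overhead.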
Building Your HDS Certification Roadmap: Actionable Steps
Here is a practical, phased roadmap for AI providers approaching HDS certification:
Month 1–2: Scoping and Strategy
- Define the precise HDS activity scope (which of the six activities apply)
- Conduct a formal gap assessment against the HDS referential
- Identify all sub-processors who will need to be brought into scope
- Select a COFRAC-accredited certification body (request quotes from at least two)
- Assign an internal certification project owner with executive sponsorship
Month 2–5: ISMS Build-Out
- If ISO 27001 is not already in place, begin implementation (this is the critical path)
- Develop HDS-specific policies: health data classification, retention/destruction, incident response for health data breaches
- Negotiate HDS-compliant data processing agreements with all relevant sub-processors
- Implement and document technical controls: encryption, access management, logging, MFA, network segmentation
Month 5–7: Evidence Collection and Pre-Audit
- Compile the full evidence package for audit
- Conduct internal audits and management reviews
- Perform a pre-audit simulation (a mock audit against the HDS referential)
- Address any remaining gaps identified in pre-audit
Month 7–9: Certification Audit
- Submit documentation for Stage 1 audit
- Host Stage 2 on-site audit
- Resolve any nonconformities
- Receive certification
AI providers who invest in a structured pre-audit simulation — a mock audit conducted against the full HDS referential — reduce their probability of major nonconformities during the certification audit by an estimated 60–70%, based on patterns observed across regulated technology certification projects.
Common Mistakes AI Providers Make in HDS Certification
Drawing from direct advisory experience across regulated technology clients, here are the most frequent — and costly — mistakes:
- Scoping too narrowly. AI providers often try to carve out as much as possible from HDS scope to reduce audit complexity. This backfires when buyers conduct due diligence and find gaps between the certification scope and actual data processing activities.
- Under-investing in sub-processor management. The contractual chain from AI provider to cloud infrastructure to foundation model APIs is often where HDS projects stall. Start sub-processor negotiations early — they take longer than expected.
- Treating HDS as a one-time project. Annual surveillance audits require ongoing maintenance of your ISMS and evidence base. Organizations that treat certification as a project rather than a program face painful scrambles at each surveillance date.
- Ignoring the French language requirement. Key policies and contracts in the HDS framework must be provided in French. This is a practical operational consideration that often catches non-French organizations by surprise.
- Failing to align HDS with the broader AI governance framework. HDS certification should be designed as a component of your overall AI governance architecture — not a siloed compliance exercise. Alignment with ISO 42001:2023 (AI management systems) and EU AI Act requirements from the outset avoids costly rework later.
How Regulated AI Consulting Supports HDS Certification Projects
At Regulated AI Consulting, we've built a specialized practice around AI governance for regulated industries — including healthcare AI providers navigating HDS, EU AI Act, and ISO 42001 simultaneously. My team brings an integrated approach that recognizes these frameworks as a unified governance architecture, not separate compliance silos.
Our HDS certification support includes:
- Scoping workshops to define your precise activity categories and sub-processor map
- Gap assessments benchmarked against the HDS referential and ISO 27001:2022
- ISMS build-out support with AI-specific adaptations
- Pre-audit simulations that mirror actual COFRAC-accredited CB audit methodology
- Ongoing compliance program management to maintain certification through surveillance cycles
With a 100% first-time audit pass rate across 200+ regulated clients and 8+ years of experience in AI and healthcare compliance, we've refined an advisory methodology that gets AI providers to certification efficiently — and keeps them there.
Explore our AI governance services for healthcare technology providers to learn how we can support your HDS certification journey.
Conclusion: HDS Certification Is a Strategic Investment, Not a Compliance Tax
The framing of HDS as a bureaucratic hurdle misses the strategic reality. For AI providers serious about the French and broader European healthcare market, HDS certification is a market access credential, a trust signal, and an organizational capability-builder all at once.
The organizations that will dominate AI-driven healthcare in France over the next five years are the ones building their compliance infrastructure now — before procurement requirements tighten further and before the EU AI Act's conformity assessment requirements reach full operational intensity.
The path to HDS certification is clear. The question is whether your organization has the governance discipline to walk it. If you're ready to find out where you stand, start with an honest gap assessment — and build from there.
About the Author: Jared Clark, JD, MBA, PMP, CMQ-OE, CQA, CPGP, RAC is the founder of Regulated AI Consulting. He has served 200+ regulated organizations across healthcare, life sciences, and financial services, maintaining a 100% first-time audit pass rate. Connect at regulatedai.consulting.
Last updated: 2026-04-10