GDPR Art. 28 + US CLOUD Act = liability
The European data protection framework was designed with one core principle: the data subject must retain control. When a company uses an AI tool to process client communications, case files, medical records, or financial documents, it enters a data processing relationship governed by GDPR Article 28. At the same time, US law—specifically the Clarifying Lawful Overseas Use of Data Act (CLOUD Act), signed into law on 23 March 2018—grants US law enforcement the authority to demand data from any company subject to US jurisdiction, regardless of where that data is physically stored.
These two legal frameworks are fundamentally incompatible. GDPR says: "You must ensure data stays protected within defined boundaries." The CLOUD Act says: "We can reach across those boundaries whenever we need to." For any European company using AI services operated by US-headquartered providers, this creates a structural legal conflict that no amount of contractual language can fully resolve.
GDPR Article 28: What it means for AI tools
Article 28 of the General Data Protection Regulation sets out the obligations that apply when a data controller (your company) engages a data processor (the AI service provider). The requirements are specific and non-negotiable:
- Written contract required (Art. 28(3)): A Data Processing Agreement (DPA) must define the subject matter and duration of processing, the nature and purpose, the type of personal data, and categories of data subjects.
- Processor must act only on instructions (Art. 28(3)(a)): The processor may not use data for its own purposes—including model training, analytics, or product improvement. This is routinely violated by consumer AI tools.
- Sub-processor transparency (Art. 28(2)): The controller must approve any sub-processors. With major AI providers, the chain of sub-processors can span dozens of entities across multiple jurisdictions.
- Adequate security measures (Art. 28(3)(c)): The processor must implement technical and organisational measures per Article 32, including encryption, pseudonymisation, and access controls.
- Deletion or return upon termination (Art. 28(3)(g)): When the contract ends, all personal data must be deleted or returned. Retention of any data—even for model improvement—is a violation.
Key takeaway: Under Art. 28(3)(a) GDPR, a data processor may only process personal data on documented instructions from the controller. When you paste client data into ChatGPT, you are sending it to a processor (OpenAI) whose standard terms historically reserved the right to use inputs for model training. Even with the opt-out, the structural compliance gap remains: you cannot audit OpenAI's infrastructure, and you cannot verify deletion.
The penalty framework is severe. Under Article 83(4) GDPR, violations of processor obligations can result in fines up to €10 million or 2% of global annual turnover—whichever is higher. Under Article 83(5), violations that affect data subject rights can double that ceiling to €20 million or 4%.
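The "whichever is higher" rule means the cap scales with turnover for large firms. A minimal sketch of the two tiers (the function name is illustrative; figures per Art. 83(4) and 83(5)):

```python
def gdpr_max_fine(annual_turnover_eur: float, tier: int) -> float:
    """Upper fine bound under GDPR Art. 83(4) (tier 1) and Art. 83(5) (tier 2).

    Both tiers take the HIGHER of a fixed ceiling and a percentage of
    global annual turnover, so the exposure grows with company size.
    """
    if tier == 1:    # processor-obligation violations, Art. 83(4): EUR 10m or 2%
        return max(10_000_000, 0.02 * annual_turnover_eur)
    if tier == 2:    # violations affecting data subject rights, Art. 83(5): EUR 20m or 4%
        return max(20_000_000, 0.04 * annual_turnover_eur)
    raise ValueError("tier must be 1 or 2")
```

For a firm with EUR 2 billion in global turnover, the 2% branch (EUR 40 million) exceeds the EUR 10 million floor, which is exactly why the turnover-based ceiling exists.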
The CLOUD Act risk: US jurisdiction over EU data
The Clarifying Lawful Overseas Use of Data Act (CLOUD Act, 18 U.S.C. §2713) was passed in response to the Microsoft Ireland case, where the US government sought emails stored on Microsoft servers in Dublin. Rather than wait for the Supreme Court to decide the territorial reach of the Stored Communications Act, Congress enacted a statute that resolved the question explicitly: US companies must comply with lawful data requests regardless of where the data is stored.
This applies to every major AI provider. OpenAI, Google (Gemini), Microsoft (Copilot), Anthropic, Amazon (Bedrock)—all are US-incorporated entities. Even when they offer "EU data residency" and process data on servers in Frankfurt or Dublin, they remain subject to US jurisdiction. A US court or law enforcement agency can issue a warrant or subpoena under the Stored Communications Act (18 U.S.C. §§2701–2712), and the company must comply.
Key takeaway: The CLOUD Act creates a legal reality where physical data location is irrelevant for US-jurisdiction companies. A "Frankfurt region" label on your AWS, Azure, or GCP instance does not protect your data from a US government request. The only structural protection is to ensure your infrastructure provider is not subject to US jurisdiction.
The European Data Protection Board (EDPB) has addressed this tension repeatedly. In its Recommendations 01/2020 on supplementary measures, the EDPB concluded that Standard Contractual Clauses (SCCs) alone are insufficient when the data importer is subject to legislation that conflicts with EU data protection standards. The Schrems II ruling (Case C-311/18, 16 July 2020) invalidated the EU-US Privacy Shield for exactly this reason. While the EU-US Data Privacy Framework (DPF) adopted in July 2023 provides a new adequacy decision, legal scholars and advocacy groups continue to challenge its durability—and it does not address the fundamental CLOUD Act conflict for data processing (as opposed to data transfer).
Professional secrecy: §203 StGB and its consequences
For an entire class of professionals in Germany, the GDPR is not even the strictest standard they must meet. Section 203 of the German Criminal Code (Strafgesetzbuch, StGB) imposes criminal liability on professionals who disclose secrets entrusted to them in their professional capacity. This is not a civil regulation with administrative fines. It is criminal law, punishable by up to one year of imprisonment or a monetary fine.
The scope of §203 StGB covers:
- Lawyers (Rechtsanwälte) — attorney-client privilege (anwaltliche Schweigepflicht)
- Physicians and psychotherapists — medical confidentiality (ärztliche Schweigepflicht)
- Tax advisors (Steuerberater) — tax advisory secrecy
- Auditors (Wirtschaftsprüfer) — audit confidentiality
- Pharmacists — pharmaceutical confidentiality
- Social workers and counsellors — social secrecy (Sozialgeheimnis)
- Sworn engineers and patent attorneys — professional secrecy
The 2017 amendment to §203 StGB (effective 9 November 2017) clarified that professionals may engage external IT service providers ("sonstige Mitwirkende") under certain conditions. However, the service provider must be contractually bound to secrecy, and the professional remains personally liable for any breach. Critically, the service provider must be under the professional's effective control—which is impossible when the provider is a US-headquartered cloud AI service subject to CLOUD Act compulsion.
Key takeaway: Under §203 StGB, a lawyer who sends client case details to ChatGPT is potentially committing a criminal offence. It does not matter that OpenAI offers a DPA. The structural inability to prevent disclosure under the CLOUD Act means the lawyer cannot guarantee the secrecy obligation. The same applies to doctors processing patient data, tax advisors handling financial records, and every other profession listed in §203.
Who needs this? Regulated industries at the highest risk
The intersection of GDPR Article 28, the CLOUD Act, and professional secrecy obligations creates a compliance requirement that is particularly acute for specific industries:
Legal sector
Law firms handle the most sensitive category of personal data: legal proceedings, criminal defence strategies, corporate M&A details, and privileged communications. The German Federal Bar (Bundesrechtsanwaltskammer, BRAK) has issued guidance warning that cloud-based AI tools from US providers may violate §203 StGB obligations. A single instance of processing privileged data through a non-compliant AI tool could constitute a breach of professional duty, trigger disciplinary proceedings, and expose the firm to criminal liability.
Healthcare
Patient data falls under both GDPR Article 9 (special categories of personal data) and the medical confidentiality provisions of §203 StGB. Additionally, healthcare data processing is subject to sector-specific regulations including the Sozialgesetzbuch (SGB) and state-level hospital data protection laws. AI tools that process patient communications, diagnostic notes, or treatment plans must operate under the strictest possible data isolation.
Tax advisory and audit
Tax advisors process detailed financial records, tax returns, salary data, and business financials. Under the Steuerberatungsgesetz (StBerG) and §203 StGB, disclosure of client financial data—even inadvertently through an AI processing pipeline—is a criminal offence. Audit firms face equivalent obligations under the Wirtschaftsprüferordnung (WPO).
Engineering and consulting
Sworn engineers (vereidigte Sachverständige) processing technical reports, patent applications, or expert opinions are bound by professional secrecy. Financial advisors handling client portfolio data face regulatory obligations under MiFID II and national transposition. Any AI tool processing this data must guarantee complete data isolation and jurisdictional compliance.
Why Hetzner Frankfurt? German law, German jurisdiction
The solution to the CLOUD Act problem is structural, not contractual. No amount of SCCs, DPAs, or contractual commitments can override a foreign government's compulsory legal authority. The only reliable mitigation is to ensure that the infrastructure provider itself is not subject to US jurisdiction.
Hetzner Online GmbH is a German company, founded in 1997, headquartered in Gunzenhausen, Bavaria. It is incorporated under German law, has no US parent company, no US subsidiary, and no structural connection to US jurisdiction. Its German data centres operate under:
- German Federal Data Protection Act (BDSG) as the national implementation of GDPR
- German Telecommunications Act (TKG) for network operations
- ISO 27001 certification for information security management
- SOC 2 Type II compliance for security controls
When ClapNClaw provisions your private AI server on Hetzner Frankfurt, the resulting infrastructure is governed exclusively by German and EU law. No US court, no US law enforcement agency, and no CLOUD Act warrant can compel Hetzner to disclose your data. This is not a contractual promise—it is a jurisdictional fact.
Key takeaway: Hetzner is a German GmbH with no US presence. Unlike AWS (Amazon), Azure (Microsoft), or GCP (Google), Hetzner is structurally immune to the CLOUD Act. Your data on a Hetzner server in Frankfurt is protected by German law and accessible only through German legal process.
AWS Bedrock EU: GDPR-compliant AI processing
"But wait," you might ask, "if AWS is a US company, doesn't the same CLOUD Act problem apply to AI inference?" This is a valid concern, and ClapNClaw's architecture addresses it through a specific mechanism: AWS Bedrock in the eu-central-1 (Frankfurt) region with zero data retention.
AWS Bedrock operates as an inference API—it processes prompts and returns completions, but does not store input or output data. When zero-data-retention is enabled:
- No prompt data is logged by AWS or the model provider
- No input or output is used for model training or improvement
- No data persists beyond the duration of the API call
- Processing occurs in the Frankfurt region, subject to EU data residency
The critical distinction is between data at rest (stored on your server, hosted by Hetzner, outside US jurisdiction) and data in transit for inference (processed momentarily by AWS Bedrock with no retention). Your persistent data—files, chat history, configurations—never leaves the Hetzner server. Only the specific prompt sent to the AI model passes through Bedrock, and it is discarded immediately after processing.
This architecture means that even in the theoretical scenario of a CLOUD Act request to Amazon, there would be no data to produce—the inference data no longer exists. Combined with the Hetzner-hosted server for all persistent storage, this creates a defence-in-depth approach to jurisdictional compliance.
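The at-rest/in-transit split described above can be sketched in code. The following is an illustrative outline, not ClapNClaw's actual implementation: the model ID is an example, and the retention behaviour is an account-level Bedrock configuration rather than a per-call flag. Only the transient request payload leaves the Hetzner server; `boto3` is imported lazily so the payload helper stays dependency-free.

```python
import json

# Illustrative Anthropic-on-Bedrock model ID; the real deployment's model and
# retention settings are configured in the Bedrock account, not in code.
MODEL_ID = "anthropic.claude-3-sonnet-20240229-v1:0"
REGION = "eu-central-1"  # Frankfurt: the inference call stays under EU data residency

def build_request(prompt: str, max_tokens: int = 1024) -> dict:
    """Build the request body for an Anthropic model on Bedrock.

    This transient payload is the only data that leaves the Hetzner server;
    files, chat history, and configuration never do.
    """
    return {
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    }

def invoke(prompt: str) -> str:
    """Send one stateless inference call to Bedrock in eu-central-1."""
    import boto3  # imported here so build_request() needs no AWS dependency
    client = boto3.client("bedrock-runtime", region_name=REGION)
    response = client.invoke_model(
        modelId=MODEL_ID,
        body=json.dumps(build_request(prompt)),
    )
    # The response body is read once and returned; nothing is persisted here.
    return json.loads(response["body"].read())["content"][0]["text"]
```

The design point is that `invoke()` holds no state between calls: once the completion is returned, neither side of the exchange retains the prompt.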
ClapNClaw sandbox: Control what your agent can access
GDPR compliance is not only about where data is stored. Article 25 (data protection by design and by default) and Article 32 (security of processing) require that technical measures limit data access to what is strictly necessary. An AI agent with unrestricted access to all company files, all email, and all databases violates the principle of data minimisation (Article 5(1)(c)).
The ClapNClaw sandbox addresses this by providing a controlled execution environment for the AI agent. Every tool the agent can use—file access, web browsing, code execution, API calls—runs inside a sandboxed container with explicitly defined permissions:
- File access scoping: The agent can only access directories you explicitly grant. Client A's files are invisible when working on Client B's case.
- Network isolation: Outbound connections are restricted to approved endpoints. The agent cannot exfiltrate data to external services.
- Tool permissions: Each workspace defines which tools (file read, file write, shell, web search) the agent may use. A read-only research workspace cannot modify files.
- Audit logging: Every action the agent takes is logged with timestamps, enabling full traceability for compliance audits.
For professionals subject to §203 StGB, the sandbox provides the technical equivalent of the "need-to-know" principle. You can use an AI agent for case research without exposing privileged communications. You can draft documents without giving the agent access to your entire file system. This is data protection by design as Article 25 GDPR intends it.
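The file-access scoping described above boils down to one check: does a requested path resolve inside a directory the workspace has been granted? A minimal stdlib sketch of that check (hypothetical function names, not ClapNClaw's internal code):

```python
from pathlib import Path

def is_allowed(requested: str, granted_roots: list[str]) -> bool:
    """Return True only if the requested path resolves inside a granted root.

    resolve() expands '..' segments first, so a traversal attempt like
    '/data/client_a/../client_b/file' is rejected when only /data/client_a
    has been granted -- Client B's files stay invisible.
    """
    target = Path(requested).resolve()
    for root in granted_roots:
        try:
            target.relative_to(Path(root).resolve())
            return True  # target sits under this granted root
        except ValueError:
            continue     # not under this root; try the next one
    return False
```

Resolving before comparing is the essential step: naive string-prefix checks are exactly what path-traversal tricks defeat.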
"Not illegal, but risky": Using ChatGPT for company data
Let us be precise about the current legal status. As of March 2026, using ChatGPT or similar US-hosted AI tools for business purposes is not automatically illegal under GDPR. OpenAI offers a DPA, claims to process EU data on EU servers, and provides an opt-out for training data usage. Many companies use these tools daily without immediate legal consequences.
However, the legal landscape is shifting rapidly:
Data protection authorities are increasing enforcement. The Italian Garante temporarily banned ChatGPT in March 2023. The EDPB's ChatGPT Taskforce, established in April 2023, has been coordinating enforcement actions across EU member states. German state-level data protection authorities (Landesdatenschutzbeauftragte) have issued warnings about the use of US-hosted AI tools for processing personal data, particularly in the public sector and regulated professions.
The EU AI Act adds a new layer. Since August 2025, the EU AI Act's transparency and risk-management obligations for general-purpose AI models have applied, with further provisions phasing in through 2026 and 2027. Organisations using general-purpose AI models for professional purposes face additional documentation, risk assessment, and transparency obligations. Non-compliance with AI Act provisions can result in fines up to €35 million or 7% of global turnover.
Courts are becoming less tolerant. The trend in German and EU jurisprudence is toward stricter interpretation of data controller obligations. The CJEU's Meta Platforms ruling (Case C-252/21, July 2023) confirmed that data controllers bear the burden of proving compliance—not just claiming it. If you cannot demonstrate that your AI tool's data processing chain is fully GDPR-compliant, the presumption will work against you.
Key takeaway: Using ChatGPT for business data today is like parking in a spot with unclear signage. You might not get a ticket today. But when enforcement catches up—and it always does—the fines will be calculated retroactively from the first day of non-compliance. For professionals bound by §203 StGB, the risk is not a fine. It is a criminal record.
ClapNClaw solves this: Architecture that addresses every legal concern
ClapNClaw was designed from the ground up to resolve the legal tensions described in this article. Here is how each component maps to a specific compliance requirement:
GDPR Art. 28 (processor obligations): Your AI server runs on dedicated Hetzner infrastructure. ClapNClaw provides a comprehensive DPA covering all processing activities. You are the controller; ClapNClaw acts strictly as processor under your documented instructions. No sub-processor has access to your persistent data.
CLOUD Act immunity: All persistent data (files, conversations, configurations, user data) is stored on Hetzner servers in Frankfurt. Hetzner is a German GmbH with no US nexus. Your data is subject exclusively to German legal process.
§203 StGB compliance: The combination of German-jurisdiction hosting, the ClapNClaw sandbox, and audit logging provides the technical and organisational framework required for professionals with secrecy obligations to use AI tools lawfully.
AI inference (AWS Bedrock): AI model processing uses AWS Bedrock eu-central-1 with zero data retention. No prompts or completions are stored. The inference layer is stateless—no data persists beyond the API call.
Data minimisation (Art. 5(1)(c)): The ClapNClaw sandbox enforces the principle of least privilege. Your agent accesses only what you explicitly allow, and every action is logged.
Data protection by design (Art. 25): The architecture implements privacy controls at every layer: server isolation, sandbox scoping, network restrictions, and audit trails. These are not bolt-on features—they are structural design decisions.
Right to erasure (Art. 17): Because all data resides on your dedicated server, deletion is complete and verifiable. There is no data scattered across training pipelines, analytics systems, or sub-processor caches.
For a detailed breakdown of the compliance architecture, see our compliance overview. To understand how ClapNClaw differs from consumer AI tools, read ChatGPT vs. a private AI agent.
Ready to make your AI usage legally bulletproof?
Book a compliance demo and we will walk through the architecture, the DPA, and how ClapNClaw maps to your specific regulatory obligations.
Book compliance demo