Updated: April 28, 2026 — Comprehensive 2026 update: current GDPR and EU AI Act practice, state of ChatGPT Enterprise contracts, new token pricing, OpenAI’s mandatory disclosure of 20 million user chats (US copyright dispute), multi-LLM reality (GPT-5, Claude, Gemini), and the deployment model in your own cloud tenant (Azure / Google Cloud / STACKIT).
The rapid development of artificial intelligence offers companies enormous efficiency and productivity gains — and at the same time, using public services like ChatGPT poses real, often underestimated data protection challenges. With GDPR and the EU AI Act, the rule for 2026 is: anyone using ChatGPT for business without a clear data protection and governance concept risks fines, the loss of trade secrets, and ultimately client trust.
This article shows, based on 50+ projects, where the risks really lie, what the paid ChatGPT versions (Team, Enterprise, Business) actually cover — and why the strategically correct answer in 2026 is not “ChatGPT Enterprise” but a private AI platform inside your own cloud tenant.
Deeper context: our pillar guide “AI for Companies 2026” places this article within the larger picture of strategy, platform choice, compliance, and rollout.
The three real risks of ChatGPT in the enterprise
1. Shadow AI: employees use ChatGPT despite bans
Bitkom studies (2024/2025) show: more than 70 % of knowledge workers privately use ChatGPT, Claude, or Gemini — including with company data. An official block on company devices only shifts the problem to private smartphones. Shadow AI is the largest unsolved GDPR risk in German companies in 2026.
The most prominent documented case: in 2023, Samsung lost confidential semiconductor design data to ChatGPT, which then flowed into OpenAI’s training pipeline. There is no shortage of similar cases in 2025/2026 — most never become public because the affected companies stay silent.
More on this in our article on Bring Your Own AI (BYOAI) in mid-market companies.
2. GDPR breach via data transfer to the US
The free ChatGPT version processes data in the US. ChatGPT Plus (private) and ChatGPT Team/Enterprise (business) are also based on US hosting — data transfer happens via the EU-US Data Privacy Framework (DPF), which is under continuous legal scrutiny following Schrems II / Schrems III.
Concretely: even ChatGPT Enterprise with a signed DPA does not protect against the US CLOUD Act. US authorities can access OpenAI data even if the data is physically processed in the EU. For confidential data (trade secrets, HR data, client information, patient data), this is a real flaw — not a theoretical one.
Recent case: in spring 2026, OpenAI was ordered as part of a US copyright dispute to disclose 20 million user chats. We analyzed the case in our article on the disclosure order. Anyone using ChatGPT for business must expect that content can be disclosed under US law.
3. Loss of data control and missing business context
ChatGPT Free / Plus uses inputs for training global models by default (an opt-out exists only in the personal settings). ChatGPT Team / Enterprise contractually rules this out — but: neither audit logs, nor granular permissions, nor a SharePoint connection with permission mirroring are available out of the box. The result: ChatGPT remains a generic assistant with no access to the data treasure that makes your company unique.
What the paid ChatGPT versions actually deliver
| Variant | GDPR status | Training opt-out | EU hosting | Audit logs | RAG on company knowledge | Multi-LLM |
|---|---|---|---|---|---|---|
| ChatGPT Free / Plus | not suitable | no (default) | no | no | no | no |
| ChatGPT Team | partially suitable | yes | no (US, DPF) | rudimentary | limited | no |
| ChatGPT Enterprise | partially suitable | yes | optional, restricted | yes | limited | no |
| ChatGPT Edu | partially suitable | yes | no (US, DPF) | yes | limited | no |
| Microsoft Copilot for M365 | GDPR-suitable (with DPA) | yes | EU Data Boundary | yes | M365 content | no |
| CompanyGPT (innFactory) | fully GDPR-compliant | yes, contractual + technical | own tenant in EU/Germany | complete | complete (incl. permission mirroring) | yes (GPT, Claude, Gemini, Llama, Mistral, Aleph Alpha) |
Microsoft Copilot for M365 is a serious alternative — we detailed the weaknesses in our article on Microsoft Copilot Flex Routing: data processing outside the EU. The central limitation: no multi-LLM strategy, no MCP extensibility, no own workflow layer like n8n.
The better path: AI platform in your own cloud tenant
Instead of a US SaaS offering or a per-user platform whose tenant a vendor controls, at innFactory we recommend, for companies from about 20 productive users, an AI platform in your own cloud tenant:
- Platform runs in your Azure, Google Cloud, or STACKIT subscription — not at the SaaS vendor
- Data and configuration stay fully under your control; you sign the cloud DPA directly with Microsoft, Google, or STACKIT
- No per-user license model — you pay infrastructure, token costs 1:1 from the hyperscaler, and a fixed-price setup plus optional managed service
- Multi-LLM routing between Azure OpenAI (GPT-5, GPT-4o), Anthropic Claude (via Bedrock/Vertex), Gemini, and sovereign open-source models via STACKIT AI Model Serving
- n8n workflow engine in the same stack for automation, MCP servers, and custom agents
This is exactly the CompanyGPT architecture: an open-source foundation (LibreChat), extended with productive enterprise features (companyRAG for SharePoint with permission mirroring, companyFILES for Office editing, companyTRANSLATE for GDPR-compliant translation), built in the customer’s cloud tenant — on Azure, Google Cloud, or fully sovereign on STACKIT.
We detailed the three deployment variants and their sovereignty levels in our ES³ article.
Three concrete advantages over ChatGPT Enterprise
Advantage 1: tenant sovereignty with the customer
With CompanyGPT there is no “SaaS tenant at the vendor”. The platform runs in your own cloud subscription. If you want to switch vendors tomorrow — replace innFactory with another consultancy — the platform stays in place because it is built on open standards (LibreChat, PostgreSQL, Kubernetes, standard LLM APIs). No vendor lock-in.
Advantage 2: multi-LLM with real model choice
ChatGPT Enterprise is locked to OpenAI models. CompanyGPT routes per use case and per request to the best model:
- GPT-5 / GPT-4o for multimodal tasks and tool use
- Claude Sonnet / Opus for reasoning, long documents, coding (see Claude for enterprise)
- Gemini 2.5 for Google Workspace integration and large multimodal inputs
- Llama / Mistral / Aleph Alpha via STACKIT AI Model Serving for fully sovereign processing
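The per-use-case routing described above can be sketched as a simple rule table. The model identifiers, use-case categories, and the `route` helper below are illustrative assumptions, not CompanyGPT’s actual API:

```python
# Illustrative multi-LLM router: picks a model per use case.
# Model names and categories are examples, not a real product API.
from dataclasses import dataclass

ROUTES = {
    "multimodal": "gpt-5",           # multimodal tasks and tool use
    "reasoning":  "claude-opus",     # long documents, coding
    "workspace":  "gemini-2.5-pro",  # Google Workspace integration
    "sovereign":  "mistral-large",   # fully sovereign EU processing
}

@dataclass
class Request:
    use_case: str
    confidential: bool = False

def route(req: Request) -> str:
    """Return the model for a request. Confidential data is forced onto
    the sovereign, EU-hosted model regardless of the use case; unknown
    use cases fall back to the reasoning model."""
    if req.confidential:
        return ROUTES["sovereign"]
    return ROUTES.get(req.use_case, ROUTES["reasoning"])
```

A real router would add token budgets and availability fallbacks, but the core design stays the same: the routing policy, not the end user, decides which data may leave the sovereign environment.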
In practice, multi-LLM strategies typically save 30–50 % of token costs while increasing output quality. More in the pillar guide.
Advantage 3: extensibility via MCP, n8n, and custom agents
ChatGPT Enterprise offers GPTs (custom assistants) as a closed construct. CompanyGPT integrates the open Model Context Protocol (MCP) — see our article on MCP as the USB-C interface for LLMs — and n8n as a workflow engine. You can build your own tools, ERP integrations, industry agents, and automations without leaving the platform. We use this productively in our own projects, as the article AI-assisted software development with OpenCode shows.
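The pattern MCP standardizes is tool discovery plus invocation: a server declares tools with names and descriptions, and an LLM client first lists them, then calls them. The sketch below illustrates that concept in plain Python; it is not the real MCP SDK, and the ERP tool is a stub:

```python
# Minimal sketch of the tool-exposure pattern that MCP standardizes:
# declare tools with metadata, let a client discover and invoke them.
# Conceptual illustration only, not the actual MCP SDK.

TOOLS = {}

def tool(name: str, description: str):
    """Register a function as a callable tool with metadata."""
    def register(fn):
        TOOLS[name] = {"description": description, "fn": fn}
        return fn
    return register

@tool("erp_order_status", "Look up an order in the ERP system")
def erp_order_status(order_id: str) -> dict:
    # A real deployment would query the ERP; here the result is stubbed.
    return {"order_id": order_id, "status": "shipped"}

def list_tools() -> list:
    """What an LLM client would receive when discovering tools."""
    return [{"name": n, "description": t["description"]} for n, t in TOOLS.items()]

def call_tool(name: str, **kwargs):
    """Invoke a registered tool by name."""
    return TOOLS[name]["fn"](**kwargs)
```

Because the protocol is open, the same tool server can serve any MCP-capable client, which is exactly what breaks the closed-GPTs lock-in described above.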
GDPR and EU AI Act in practice
GDPR obligations (state 2026)
- Data Processing Agreement (DPA) — for CompanyGPT: DPA with innFactory plus cloud DPA with Microsoft / Google / STACKIT
- Data Protection Impact Assessment (DPIA) for high-risk applications (recruiting, automated decisions, credit scoring)
- Deletion concept and data subject rights — for CompanyGPT: full audit trails and programmatic deletion
- International data transfer — avoid where possible; with CompanyGPT optionally fully avoidable (STACKIT variant)
EU AI Act from 2026
The EU AI Act has been in force since August 2024, with staggered deadlines:
- February 2025: prohibited AI practices, obligation for AI literacy (Art. 4)
- August 2025: obligations for general-purpose AI (GPAI)
- August 2026: full applicability for high-risk systems (recruiting, education, critical infrastructure)
Practical consequence: you must train staff, classify high-risk applications, and establish risk management per Art. 9. We provide this as an AI compliance service and AI officer as a service — backed by our advisory board with Andreas Noerr, attorney for IT law, and Prof. Dr. Sebastian Bayerl, professor of applied AI.
Deeper context in NIS2, KRITIS, and the EU AI Act: what mid-market companies need to know now.
What does CompanyGPT cost compared to ChatGPT Enterprise?
ChatGPT Enterprise is typically negotiated from 60 USD per user per month — at 200 users that is about 144,000 USD per year (~135,000 EUR). Token costs, implementation, and training come on top.
CompanyGPT TCO at 200 users (example calculation):
- Platform setup in customer tenant (one-time): about 30,000–60,000 EUR
- Cloud infrastructure (Azure/STACKIT): about 6,000–18,000 EUR per year, billed directly by the hyperscaler
- Token costs at hyperscaler list price (no markup): about 12,000–30,000 EUR per year
- Managed service by innFactory (updates, optimization, monitoring): about 18,000–36,000 EUR per year
First year: about 66,000–144,000 EUR, from year 2 about 36,000–84,000 EUR — with full data sovereignty, multi-LLM, and extensibility via MCP and n8n. Full TCO comparison in the pillar guide.
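The figures above can be reproduced with a small calculation. All numbers are the example values from the text (EUR ranges for CompanyGPT, USD list price for ChatGPT Enterprise); actual project costs will vary:

```python
# Reproduces the example TCO figures from the text.

def chatgpt_enterprise_license(users: int, usd_per_user_month: float = 60.0) -> float:
    """Annual license cost in USD (excl. tokens, implementation, training)."""
    return users * usd_per_user_month * 12

# CompanyGPT example cost ranges from the text as (low, high) in EUR.
SETUP_ONCE       = (30_000, 60_000)  # one-time platform setup
INFRA_PER_YEAR   = (6_000, 18_000)   # cloud infrastructure
TOKENS_PER_YEAR  = (12_000, 30_000)  # token costs at list price
MANAGED_PER_YEAR = (18_000, 36_000)  # managed service

def company_gpt_year1() -> tuple:
    """Total first-year range: setup plus all recurring items."""
    recurring_low, recurring_high = company_gpt_year2plus()
    return (SETUP_ONCE[0] + recurring_low, SETUP_ONCE[1] + recurring_high)

def company_gpt_year2plus() -> tuple:
    """Recurring annual range from year 2 onward."""
    return (INFRA_PER_YEAR[0] + TOKENS_PER_YEAR[0] + MANAGED_PER_YEAR[0],
            INFRA_PER_YEAR[1] + TOKENS_PER_YEAR[1] + MANAGED_PER_YEAR[1])
```

Running the helpers confirms the article’s ranges: 144,000 USD per year for 200 Enterprise seats, 66,000–144,000 EUR in year one and 36,000–84,000 EUR from year two for CompanyGPT.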
Frequently asked questions
Is ChatGPT Enterprise GDPR-compliant? Only conditionally. With a DPA, EU region, and DPF transfer, use is possible, but the US CLOUD Act remains a legal residual risk. For confidential data, a platform operated inside your own cloud tenant (Azure / STACKIT) is significantly safer.
From how many users does CompanyGPT pay off? Economically from about 20 productive users. The fixed-price setup in your own tenant does not scale linearly with user count, so the advantage over per-user SaaS grows with every additional user.
Do we have to choose Microsoft, Google, or STACKIT? No, this depends on your existing cloud strategy. As a Microsoft, Google Cloud, and STACKIT partner, we build the platform in any of these three environments — and combine them on request (e.g., STACKIT for the sovereign core plus Azure for frontier models).
How long does the introduction take? Technical setup in the customer tenant is finished in a few days. Pilot with 10–30 power users in 6 weeks, full rollout typically after 3–6 months.
Can existing ChatGPT Enterprise licenses be replaced? Yes, and in most cases it makes sense. Migration covers custom GPTs, knowledge bases, and prompt templates; we have done this multiple times. Talk to us for a migration roadmap.
What about Microsoft Copilot for M365? A valid alternative for standard products, but without multi-LLM, without MCP, without n8n workflows. Details and weaknesses are analyzed in Microsoft Copilot Flex Routing and the comparison LibreChat vs. Open WebUI vs. Copilot.
Next steps
If you want to reduce the risks of ChatGPT in your company without giving up AI capability, you have three concrete options:
- Architecture workshop (free, 60 min): we analyze your IT landscape (M365, Google Workspace, on-premise) and give a substantiated platform recommendation — book a meeting.
- CompanyGPT demo: 30-minute live demo of the platform with your own use cases — request a demo.
- EU AI Act readiness check: in four weeks you know where you stand and what to do until full applicability — view service.
We are a German consultancy with both tech and legal DNA — Tobias Jonas (M.Sc. AI/Cloud), Fabian Artmann (M.Eng. industrial engineering), our advisory board with Andreas Noerr (attorney IT law), Prof. Dr. Sebastian Bayerl (applied AI), and Daniel Artmann (member of Bavarian state parliament). Microsoft, Google Cloud, and STACKIT partner. More about us.
Updated by Tobias Jonas, Co-CEO innFactory AI Consulting GmbH. Last update: April 28, 2026. We update this article regularly based on new case law, model releases, and insights from active projects.
