If you are introducing artificial intelligence in your company in 2026, the question is no longer “whether” but “how”. Three quarters of German enterprises are already experimenting with generative AI — yet only a fraction make the leap from pilot to productive operation. This guide consolidates what we at innFactory AI have learned from over 50 AI projects: from strategy to rollout, from GDPR check to EU AI Act, from platform choice to measurable ROI.
Why AI in the enterprise must be a strategic decision now
The last 18 months have created three realities every executive needs to know:
- Shadow AI is everywhere. Studies by the German industry association Bitkom show that over 70% of knowledge workers privately use ChatGPT, Claude or Gemini — including with company data. Companies that do not provide an official platform lose control over sensitive information.
- The EU AI Act is binding. In force since August 2024, it applies in stages: first obligations from February 2025 (Art. 4: AI literacy, plus the bans on prohibited practices), obligations for general-purpose AI models from August 2025, and full applicability from August 2026. Failing to act means risking fines of up to €35 million or 7% of global annual turnover.
- Productivity gains are measurable. McKinsey estimates the potential of generative AI at USD 2.6–4.4 trillion annually — distributed across customer operations, marketing, software engineering and R&D. Companies that anchor AI structurally are pulling ahead.
The conclusion: AI must be introduced today as strategic infrastructure, not as a collection of tools.
The five phases of a professional AI rollout
In our projects, a 5-phase model has proven effective. Each phase has a clear deliverable and handover — no endless pilot loops.
Phase 1: Strategy & use case mapping (weeks 1–4)
Before a single tool is rolled out, answer:
- Which three business areas benefit most from AI? (typically sales, customer service, IT/engineering)
- Which data classes may be processed? (public, internal, confidential, strictly confidential)
- Which compliance frameworks apply? (GDPR, EU AI Act, sector-specific)
- What is the target maturity level in 12 months? (awareness → experiment → adoption → integration → transformation)
The result is an AI strategy with a prioritised use case backlog, budget envelope and governance outline.
Phase 2: Platform selection (weeks 4–8)
The platform decision determines data sovereignty, scalability and downstream costs. The evaluation dimensions that drive the decision in our projects:
- Tenant ownership: Does the platform run in the customer’s own cloud account (Azure, Google Cloud, STACKIT) — or in the vendor’s multi-tenant SaaS? Only the former gives full control.
- Licence model: Per-user SaaS licences scale expensively and unpredictably. A fixed-price build in your own cloud, where you pay token costs 1:1 to the hyperscaler, is significantly more predictable.
- Model flexibility: multi-LLM (Azure OpenAI, Anthropic via Bedrock/Vertex, Gemini, Llama/Mistral/Aleph Alpha via STACKIT) instead of vendor lock-in.
- Integration depth: SharePoint with permission mirroring, Confluence, Salesforce, ERP — direct access, not just file upload.
- Governance: audit logs, roles, policy enforcement, cost tracking per team.
- Extensibility: API, MCP servers, custom agents, workflow engine (n8n) — operated in your own stack, not behind a SaaS API.
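A simple way to make this decision traceable is a weighted scoring matrix over the dimensions above. The sketch below is illustrative only — the weights and 1–5 scores are hypothetical placeholders, not our evaluation results:

```python
# Hypothetical weighted scoring across the evaluation dimensions above.
# Weights and per-option scores are illustrative, not a recommendation.

CRITERIA = {  # criterion -> weight (sums to 1.0)
    "tenant_ownership": 0.25,
    "licence_model": 0.20,
    "model_flexibility": 0.15,
    "integration_depth": 0.15,
    "governance": 0.15,
    "extensibility": 0.10,
}

def score(option: dict[str, int]) -> float:
    """Weighted sum of 1-5 scores per criterion."""
    return sum(CRITERIA[c] * option[c] for c in CRITERIA)

options = {
    "SaaS multi-tenant": {"tenant_ownership": 2, "licence_model": 2,
                          "model_flexibility": 3, "integration_depth": 4,
                          "governance": 3, "extensibility": 2},
    "Own cloud tenant":  {"tenant_ownership": 5, "licence_model": 4,
                          "model_flexibility": 5, "integration_depth": 4,
                          "governance": 4, "extensibility": 5},
}

# Print options from best to worst weighted score
for name, opt in sorted(options.items(), key=lambda kv: -score(kv[1])):
    print(f"{name}: {score(opt):.2f}")
```

Replace the weights and scores with your own workshop results; the point is to force an explicit trade-off rather than an intuition-driven platform choice.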
This is exactly where our CompanyGPT architecture sits: We build the platform on an open-source foundation (LibreChat) inside the customer’s cloud tenant — in Azure, Google Cloud or fully sovereign on STACKIT. Infrastructure, data and model access stay entirely with the customer. Three deployment variants (STACKIT Only, STACKIT + Hyperscaler, Azure-native) cover all sovereignty requirements — we have detailed this in our ES³ article. For direct comparisons with SaaS-based competitors, see our articles on Langdock, Logicc, neuland.ai and Telekom Business GPT.
Phase 3: Compliance & governance (weeks 6–12, parallel to phase 2)
This is the decisive difference between a project that fails an audit and one that scales:
- GDPR compliance: data processing agreement (DPA), data protection impact assessment (DPIA) for high-risk applications, deletion concept, data subject rights
- EU AI Act: classification of the application (prohibited, high-risk, transparent, minimal), conformity assessment, risk management system per Art. 9
- Internal governance: AI officer (mandatory in many sectors), AI acceptable use policy, training obligation per Art. 4
- Sector regulation: § 203 German Criminal Code (professional secrecy), bar association rules, banking regulation, MDR (medical devices)
We have detailed this topic with our advisory board member Andreas Noerr (specialist IT lawyer) in a dedicated whitepaper.
Phase 4: Pilot & rollout (weeks 8–24)
The most common mistake: a single mega-pilot with 500 people that fizzles out after three months. A better approach:
- Champions pilot (10–30 people): power users from all departments, intensive onboarding, weekly exchange
- Adoption wave (100–300 people): structured training, sector-specific use case library, office hours
- Full rollout (all employees): self-service onboarding, automatic licence assignment via SCIM, continuous monitoring
Success metrics we track in every project:
- Weekly active users (WAU) ÷ licences → target: >60% after 90 days
- Number of productive use cases → target: >5 per department after 6 months
- Time saved per user → target: >2 hours/week
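These metrics are easy to track programmatically. A minimal sketch with hypothetical numbers — the thresholds mirror the targets above, the snapshot values are invented:

```python
from dataclasses import dataclass

@dataclass
class AdoptionSnapshot:
    weekly_active_users: int
    licences: int
    productive_use_cases_per_dept: float
    hours_saved_per_user_week: float

    def wau_ratio(self) -> float:
        """Share of licensed users who are active in a given week."""
        return self.weekly_active_users / self.licences

    def on_track(self) -> dict[str, bool]:
        # Targets from the rollout plan: >60% WAU, >5 use cases
        # per department, >2 hours saved per user per week
        return {
            "wau": self.wau_ratio() > 0.60,
            "use_cases": self.productive_use_cases_per_dept > 5,
            "time_saved": self.hours_saved_per_user_week > 2,
        }

# Hypothetical snapshot 90 days into a 1,000-licence rollout
snap = AdoptionSnapshot(weekly_active_users=640, licences=1000,
                        productive_use_cases_per_dept=6,
                        hours_saved_per_user_week=2.5)
print(snap.on_track())
```

In practice the inputs come from the platform's audit logs and periodic user surveys; the value of the sketch is that the targets are encoded once and checked the same way every week.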
Phase 5: Scaling & automation (from month 6)
Chat becomes agent, tool becomes platform. In this phase:
- Sector-specific assistants for sales, HR, legal, customer service
- Workflow automation via n8n and AI agents
- RAG systems on company knowledge (SharePoint, Confluence, custom databases)
- Custom AI apps for regulated processes (contract review, claims handling, application analysis)
- Cost optimisation: right-sizing models, caching, prompt tuning
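Cost optimisation often starts with very simple mechanisms. The sketch below shows caching of repeated prompts; `call_llm` is a hypothetical stand-in for a real hyperscaler API call:

```python
import hashlib

API_CALLS = 0  # counts underlying (paid) model calls

def call_llm(prompt: str, model: str = "small-model") -> str:
    """Placeholder for a real hyperscaler API call (hypothetical)."""
    global API_CALLS
    API_CALLS += 1
    return f"[{model}] answer to: {prompt[:40]}"

_cache: dict[str, str] = {}

def cached_completion(prompt: str, model: str = "small-model") -> str:
    """Serve identical (model, prompt) pairs from a local cache so
    repeated questions do not trigger a second paid API call."""
    key = hashlib.sha256(f"{model}:{prompt}".encode()).hexdigest()
    if key not in _cache:
        _cache[key] = call_llm(prompt, model)
    return _cache[key]

# Two identical requests -> only one underlying API call
a = cached_completion("What is our travel expense policy?")
b = cached_completion("What is our travel expense policy?")
print(a == b, API_CALLS)  # True 1
```

A real deployment would add expiry and would only cache deterministic, non-personalised answers; combined with routing short questions to a smaller model, this is where most of the token savings in phase 5 come from.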
GDPR and EU AI Act: what really applies
In every initial consultation, we hear the same three myths — and need to dismantle them before productive work begins.
Myth 1: “ChatGPT is GDPR compliant if we use the paid version.”
False. OpenAI’s ChatGPT Team and Enterprise process data with privacy commitments, but the server location is outside the EU. This does not preclude usage but requires proper data transfer based on the EU-US Data Privacy Framework — which does not protect against the US CLOUD Act. For confidential data (trade secrets, HR data, client information), an EU-sovereign platform is significantly safer. Details in our article ChatGPT for Business: Risks & GDPR Status.
Myth 2: “The EU AI Act only affects providers, not us as users.”
False. The AI Act distinguishes between providers, deployers, importers and distributors. As a company using AI — for example for automated decisions in recruitment or credit scoring — you are a deployer with your own obligations: risk management, human oversight, documentation, training.
Myth 3: “We just need a data protection officer, that’s enough for AI.”
False. GDPR and EU AI Act overlap but are not identical. The DPO covers personal data — the AI officer (or comparable function) covers risk management, model governance and employee training under Art. 4 AI Act. We offer both as a service if you do not want to build a dedicated full-time role.
AI platforms: build, buy or run in your own tenant?
Three paths lead to productive AI in the enterprise — and we recommend the third for the vast majority of mid-market companies and corporates.
Path 1: Build on hyperscaler (Azure OpenAI, Google Vertex, AWS Bedrock)
Sweet spot: Corporates with large IT teams that want full control and deep custom integrations — and the capacity to evolve a platform for years.
Reality: The first weeks are euphoric; then come authentication, logging, cost management, RAG, agents and the UI — suddenly you are building a platform instead of shipping AI use cases. We have seen this multiple times and have rolled back such self-built stacks for several clients.
Path 2: SaaS platforms (multi-tenant)
Sweet spot: Companies that want to start fast and have no own cloud strategy.
Reality: Works for standard use cases. Three weaknesses typically appear after 6–12 months: (1) per-user licences become expensive once adoption broadens, (2) vendor token mark-ups (typically 10–30%) compound, (3) data and configuration sit in the vendor’s tenant — a vendor switch or insolvency means a migration project. Our comparison articles on Langdock, Logicc, neuland.ai and Telekom Business GPT show the differences in detail.
Path 3: Platform in your own cloud tenant — our recommended path
Sweet spot: Companies from around 20 productive users upwards, who want data control, predictable cost and extensibility — from mid-market to DAX corporate.
Reality: An open-source-based platform (e.g. LibreChat at the core) is installed in the customer’s cloud tenant — Azure, Google Cloud or STACKIT, depending on existing IT strategy. Model access runs through the customer’s own hyperscaler APIs (Azure OpenAI, Vertex AI, Bedrock, STACKIT AI Model Serving), token costs go 1:1 to the hyperscaler. Extensions (custom agents, MCP servers, n8n workflows, sector-specific apps) live in the same stack.
This is exactly the CompanyGPT philosophy: no SaaS licences, no token mark-ups, no shared tenant. Setup takes a few days, operations are reduced to your own stack. As Microsoft, Google Cloud and STACKIT partners we bring the experience to set this up to best practices — and on request we take over operations, updates and extensions.
Top use cases by department
Here, condensed, the use cases that delivered the largest ROI in our projects — a detailed list is in the cluster article AI Assistant for Companies: 7 Use Cases.
| Department | Top use case | Time saved/week |
|---|---|---|
| Sales | Quote generation with ERP data | 4–6 h |
| Customer service | Ticket triage and reply drafts | 6–10 h |
| HR | Onboarding assistant, job ads | 3–5 h |
| Legal | Contract review against internal compliance | 5–8 h |
| Finance | Receipt review, Excel analysis | 3–6 h |
| Marketing | SEO content, newsletters, campaigns | 5–8 h |
| Engineering | Code reviews, documentation, tests | 6–12 h |
What does AI in the enterprise really cost?
An honest TCO calculation depends fundamentally on the licence model. Two paths for a company with 1,000 employees:
Model A: SaaS platform with per-user licence
- Platform licences: €15–35 / user / month → €180k–420k annually
- Token mark-ups (often 10–30% on API prices): €20k–60k annually
- Workflow modules separately: from approx. €6.5k/year
- Initial implementation: €30k–80k
- Training & change management: €20k–80k in year one
- AI officer (service or in-house): €24k–90k annually
- Compliance audit: €15k–40k annually
Year one: approx. €295k–780k, from year two approx. €245k–620k.
Model B: Platform in your own cloud tenant (CompanyGPT approach)
- Platform build in customer tenant (one-off, fixed price): €30k–90k
- Cloud infrastructure (Azure/GCP/STACKIT) for 1,000 employees: €20k–60k annually (paid by customer directly to hyperscaler)
- Token costs at hyperscaler list price (no mark-up): €30k–120k annually, depending on use-case mix
- Maintenance, updates, optimisation (managed service by innFactory): €24k–60k annually
- Training & change management: €20k–80k in year one
- AI officer (service or in-house): €24k–90k annually
- Compliance audit: €15k–40k annually
Year one: approx. €163k–540k, from year two approx. €113k–370k.
That typically saves €100k–250k annually at 1,000 users — and additionally gives full control over data, configuration and model access. The gap widens with scale, because SaaS licences grow linearly with user count whereas the tenant build-out is largely a fixed cost.
Set against this, serious adoption yields productivity gains of 2–5 hours per employee per week — at a fully loaded hourly cost of €100 and 1,000 employees, that is value of roughly €10–25 million annually. The lever is not the licence cost; it is adoption.
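The arithmetic behind these figures can be reproduced in a few lines. The sketch below uses the mid-points of the cost ranges from the two models above and assumes roughly 46 productive weeks per year — both are our simplifying assumptions, not fixed inputs:

```python
EMPLOYEES = 1000
WORK_WEEKS = 46     # assumption: ~46 productive weeks per year
HOURLY_COST = 100   # fully loaded cost per hour, EUR

def annual_value(hours_saved_per_week: float) -> float:
    """Value of time saved across the workforce, EUR per year."""
    return EMPLOYEES * hours_saved_per_week * WORK_WEEKS * HOURLY_COST

# Recurring cost from year two, EUR (mid-points of the ranges above)
saas_year2 = 300_000 + 40_000 + 6_500 + 57_000 + 27_500    # Model A
tenant_year2 = 40_000 + 75_000 + 42_000 + 57_000 + 27_500  # Model B

for hours in (2, 5):
    print(f"{hours} h/week -> value ~EUR {annual_value(hours)/1e6:.1f}M, "
          f"vs. Model B cost EUR {tenant_year2/1e3:.0f}k/year")
```

Even at the conservative end (2 hours per week), the productivity value exceeds the full platform cost by an order of magnitude — which is why adoption, not licence price, is the number to manage.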
Frequently asked questions
How long does an AI rollout take? From strategy workshop to full rollout for 1,000 employees, we typically plan 6–9 months. The first productive use cases run after 4–6 weeks.
Do we need on-premise or is EU hosting enough? For 90% of use cases, EU hosting with a German vendor and a DPA is sufficient. On-premise is only justified for highly sensitive data (critical infrastructure, defence, medical confidentiality) or where group policies mandate it.
Do we need an AI officer? Not generally mandatory, but Art. 4 EU AI Act requires AI literacy across the organisation. In regulated sectors (banking, insurance, healthcare), a responsible person becomes effectively indispensable. We offer this as AI Officer as a Service.
Which platform fits our mid-market company? It depends on four factors: existing cloud strategy (Microsoft, Google, STACKIT), data protection needs, industry and adoption speed. For most companies from 20 users upwards, a platform in your own cloud tenant — operated on an open-source foundation — is the best path. Book a free architecture consultation; within 60 minutes you will have a substantive recommendation.
At what user count does a dedicated AI platform pay off? CompanyGPT pays off from approximately 20 productive users. Below that, many companies work with ChatGPT Team or Copilot, which is enough for pure chat functionality. As soon as you need SharePoint integration, workflow automation or industry-specific customisation, your own tenant is worth it.
What is the difference between ChatGPT Enterprise and CompanyGPT? ChatGPT Enterprise is a US-hosted multi-tenant SaaS service with DPF-based data transfer and a fixed model family. CompanyGPT runs in the customer’s cloud tenant (Azure, Google Cloud or STACKIT), uses multi-LLM (Azure OpenAI, Anthropic Claude, Gemini, Llama/Mistral via STACKIT), combines chat with workflow automation (n8n) and is freely extensible via MCP. Details in our article ChatGPT for Business: Risks & GDPR Status.
Next steps
If you have read this far, you are among the decision makers approaching AI strategically. Three concrete options:
- Architecture workshop (free, 60 min): We analyse your IT landscape and give a platform recommendation — book a slot.
- EU AI Act readiness check: In four weeks you know where you stand and what to do until full applicability — view service.
- CompanyGPT demo: 30-minute live demo of the platform with your own use cases — request demo.
We are a consulting firm with tech and legal DNA — Tobias Jonas (M.Sc. AI/Cloud), Fabian Artmann (M.Eng. industrial engineering) and our advisory board with Andreas Noerr (specialist IT lawyer), Prof. Dr Sebastian Bayerl (applied AI) and Daniel Artmann (member of Bavarian parliament). More about us.
This guide is updated quarterly. Last update: 28 April 2026. Written by Tobias Jonas, Co-CEO innFactory AI Consulting GmbH.
