
AI data leaks: why private AI is now a prerequisite

Employees use AI to work faster. Drop in a document, add some customer data, get an analysis within seconds. What gets underestimated: that data leaves your organisation, and often the Netherlands, and you never get it back.


The new data leak nobody reports

More than 70% of employees use AI tools without any policy or control in place. In 2025 the Dutch Data Protection Authority received dozens of data breach notifications directly traceable to workplace AI use. That number is an underestimate, because many incidents aren't even recognised as data breaches. Pasting a contract into ChatGPT for a summary doesn't feel like a breach. It is one.

What you've lost the moment it's sent

The moment customer data enters an external AI, you've lost control. You don't know where it's stored, who has access, or whether it's used to further train the model. Legally it may feel like a grey area, but for your customer the effect is identical to a classic data breach. Under the GDPR the route doesn't matter: your customer pays the price, and you carry the responsibility.

Your data leaves the Netherlands faster than you think

Most popular AI services run on US infrastructure. Data leaves Europe in milliseconds and falls under laws you have no say over. The CLOUD Act gives US authorities access to data held by US providers, even when the data centre sits in Frankfurt or Amsterdam. A European server does not automatically save you.

Four questions you must ask yourself

1. Which AI tools are your employees using today, and do you know for sure?
2. Is there a policy on which data is allowed where, and is it enforced?
3. Does the tool process personal data or business secrets, and where does that data live?
4. Does the AI run inside your control, or at a third party you have no say over?

If you can't answer these immediately, you already have a problem.

Sound familiar? Let us take a look.

Get in touch →

Private AI as a prerequisite

We run data-sensitive AI workloads locally: inside your own environment, or on our own cluster in the Netherlands, with 128 GB of VRAM, our own hardware, and no external parties. ISO 27001 certification is expected in Q2 2026. No data leaves the organisation or the country, and there is no doubt about who has access. For critical systems that's no longer a preference; it's the only way to guarantee customer data stays where it belongs.
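To make the difference concrete, here is a minimal sketch of what "AI inside your control" can look like in code. It assumes a locally hosted model served through an OpenAI-compatible endpoint, such as Ollama's default on port 11434; the endpoint, model name and prompt are illustrative, not a description of our production setup.

# Minimal sketch: the same client code, pointed at a local model
# instead of a cloud API. Assumes a local server (e.g. Ollama)
# exposing an OpenAI-compatible endpoint; "llama3" is illustrative.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # local endpoint: the prompt never leaves this machine
    api_key="not-needed",                  # local servers typically ignore the key
)

response = client.chat.completions.create(
    model="llama3",
    messages=[{"role": "user", "content": "Summarise this contract: ..."}],
)
print(response.choices[0].message.content)

The point of the sketch: the workflow for employees stays identical, only the destination changes. Swap one base URL and the contract summary is produced on hardware you control instead of a third party's.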

Carola Abbenhuis-Mensink

Marketing Coordinator at Wabber B.V.

Do you know where your customer data ends up today?

Take the free AI Readiness Scan. Six questions, instant score and advice.