
AI and Your Inbox: How Safe Are Your Emails?
Artificial intelligence is now capable of managing our inboxes, from drafting replies to summarising long email chains.
For busy professionals, that’s an appealing development. But for mediators and others working in dispute resolution, a serious question follows: how do we balance AI in mediation (and in arbitration, adjudication, and other ADR processes) with GDPR?
Consumer AI vs Enterprise AI in Mediation
The first distinction is between tools designed for casual use and those intended for professional environments.
- Consumer AI (e.g. ChatGPT Plus/Pro): Powerful and accessible, but not built with confidentiality safeguards, which makes these tools unsuitable for handling sensitive case data.
- Enterprise AI (e.g. Microsoft Copilot, Google Gemini, ChatGPT Enterprise): These are designed for business use. They provide Data Processing Agreements, sit within secure corporate environments, and offer stronger compliance measures.
On the surface, enterprise solutions seem to solve the problem. They do make things safer, but they don’t remove the risks entirely.
Are Enterprise AI Tools GDPR Compliant for Mediators?
Even with enterprise AI, professional responsibilities remain:
- GDPR still applies. As data controllers, mediators must decide what data is appropriate to process, ensure necessity and proportionality, and inform clients.
- Confidentiality still matters. Not every email should be handed to AI systems, however secure they may be. Judgement is required.
- Human error is still the weak point. Put the wrong information in, and the system will process it, security wrapper or not.
This is the heart of the AI-and-GDPR question in mediation: enterprise AI reduces risk, but it does not remove it.
What This Means for Mediators
Mediators are trusted with information that is highly sensitive, personal, and often commercially valuable. If AI is to support us in managing inboxes, case files, or even analysis, it must be introduced with caution.
At Hunt ADR, we believe AI has an important role to play, but it must be adopted responsibly. Balancing innovation with compliance is essential if we are to protect the trust placed in us as professionals.
Learn More: AI for Mediators
These issues, from confidentiality and GDPR through to practical, (relatively) safe AI applications, are exactly what we cover on our AI for Mediators course. It is designed to help mediators embrace AI tools without compromising professional standards.
Find out more about our AI for Mediators training on confidentiality and GDPR compliance.