Public AI tools like ChatGPT, Gemini, and Copilot are genuinely useful. They are fast, capable, and accessible. But for law firms and medical practices operating in South Africa, using them for client-related work is not a productivity decision — it is a liability decision.
And the liability is real.
What Happens When You Type a Client Matter Into a Public AI
When you paste a client brief, medical record, case summary, or consultation note into a public AI tool, that information leaves your environment. It gets processed by a server outside your control, potentially stored, and potentially used to train or improve the model further.
You may not be aware this is happening. Your client almost certainly is not.
For law firms, this means:
- Confidential client information processed outside your control
- Potential breach of legal professional privilege
- Risk of disclosure to third parties
- Violation of your duty of confidentiality
For medical practices, this means:
- Patient health data handled by an external party
- Potential POPIA non-compliance on sensitive personal information
- Risk of data being retained, processed, or exposed abroad
- Breach of the patient’s right to confidentiality
POPIA Makes This a Compliance Issue, Not Just an Ethics Issue
South Africa’s Protection of Personal Information Act (POPIA) applies to any organisation processing personal information of South African data subjects. For legal and medical professionals, this includes client names, case details, medical conditions, treatment plans, billing information, and more.
Under POPIA, processing personal information means collecting, storing, using, transmitting, or allowing access to it. When you input client or patient data into a public AI platform, all of those things happen — often simultaneously — and you may have no visibility into how the data is retained or who can access it.
The Information Regulator has the authority to investigate, issue enforcement notices, and impose penalties of up to R10 million for serious breaches. Reputational damage is a separate, arguably more significant risk for professional services firms.
The Problem with Terms of Service
Some professionals believe they are protected because AI platforms have enterprise terms of service that restrict data use. There are several problems with this position:
- Most users are on consumer or standard plans, not enterprise agreements with data processing addenda
- Enterprise plans still process data offshore, creating cross-border transfer questions under POPIA
- Client or patient consent for their information to be processed by a third-party AI provider was almost certainly never obtained
- Terms of service change, and retrospective data policy updates have occurred across major platforms
The only truly safe position is to use an AI system where your data never leaves your environment.
What Private AI Installation Means in Practice
A private AI installation is a large language model — capable of the same drafting, summarising, research, and analysis tasks — that runs entirely on hardware you control. Data does not leave your network. No third party processes it. No external server receives it.
This is not a theoretical or experimental setup. It is deployable today, on hardware within your office, on a cloud instance you own, or on a private server managed by your IT provider. The models available for private installation have closed the capability gap with public platforms significantly over the past 18 months.
Company Connect installs and configures private AI environments for professional services firms in South Africa. The system is set up on your infrastructure, configured for your workflows, and secured against external access.
The result is:
- Full AI capability for drafting, review, research, and summarisation
- No data leaving your environment
- POPIA-aligned data handling
- Auditability and access control
- No ongoing per-seat licence tied to a public platform
Who This Applies To
This is not limited to large firms. Any practice handling confidential client or patient information should be assessing its AI tool usage now, before an incident forces the assessment.
Common use cases that carry risk when handled through public AI:
- Drafting and reviewing contracts using client-specific details
- Summarising affidavits, medical reports, or patient histories
- Generating correspondence that references confidential matters
- Analysing billing or financial information against client records
- Research tasks that include case-specific detail to get better answers
If any of these tasks describe how your team currently uses AI, the exposure is active.
The Practical Path Forward
The decision is not between AI and no AI. That ship has sailed, and the productivity case for AI in professional services is real. The decision is between:
- Public AI, where data leaves your environment and compliance risk is ongoing
- Private AI, where capability is equivalent and your data stays under your control
For law firms and medical practices in South Africa, the second option is not a premium add-on or an edge case. It is the appropriate baseline.
If you want to understand what a private AI installation involves for your practice — hardware requirements, cost, setup timeline, and what you can do with it — speak to Company Connect. We install and configure private AI systems built for professional services workflows in South Africa.
Learn more about our Private AI Installation service