Minute Taking and AI Transcription Security – why minute-taking training is still required
Here’s the reality everyone in the profession secretly hoped wouldn’t be true. AI hasn’t eliminated the need for traditional minute-taking. Not yet. And it doesn’t look like it will for some time. Why, you may ask? Because of the serious security concerns we still need to address around AI and the commonly used transcription tools.
When AI transcription tools first became widely available, they seemed like the answer we’d been waiting for. Fast, accurate, affordable. Major corporates adopted them. Training providers recommended them. We all assumed that if blue chip companies were using these platforms, they must have been properly vetted.
Turns out, that assumption needs revisiting.
Recent legal developments and deeper research into how these services actually handle data have revealed a more complex picture. One that affects everyone in the minute-taking profession, from solo practitioners to corporate governance teams.
The Landscape Has Changed
AI transcription tools like Otter.ai, Fireflies.ai, and similar platforms have become standard practice across industries. Walk into any corporate office and you’ll find them in use. Marketing teams recording focus groups. HR departments transcribing interviews. Executive assistants capturing board discussions.
These weren’t rogue choices. These were deliberate decisions by professionals who assumed these well-known services had proper security measures in place.
But here’s what recent research has uncovered. Many of these platforms operate on business models that may not align with South African data protection requirements. Particularly when processing confidential information under POPIA.
And it’s not just small businesses making these choices. Blue chip corporates, organisations with entire compliance departments, have been using these exact tools for sensitive internal work.
The Unauthorised AI Problem Is Bigger Than You Think
Our research has found that over 60% of employees are using free AI tools without organisational authorisation. That’s not a small compliance gap. That’s a systemic issue.
Minute takers uploading board recordings to consumer AI services. Team members pasting confidential discussions into ChatGPT to generate summaries. Professionals using whatever free tool appears first in search results without understanding the data implications.
This isn’t happening because people are reckless. It’s happening because the technology arrived faster than the guidance on how to use it safely. We’re all adapting in real time to a landscape that didn’t exist five years ago.
What We’ve Learned About These Platforms
When you dig into how popular AI transcription services actually operate, some uncomfortable questions emerge.
Many platforms are based overseas. When you upload a recording, that file crosses international borders within seconds. It sits on servers in other countries. And depending on which service you’re using and which pricing tier you’re on, the terms of service may grant the vendor rights to use your content for “model improvement”.
That’s not necessarily sinister. It’s how many AI services improve their technology. But when the content is a confidential disciplinary hearing, a board meeting about restructuring, or a settlement negotiation, the question becomes whether this business model fits within POPIA’s framework.
Section 19 of POPIA requires securing personal information with appropriate safeguards. Section 72 restricts cross-border data transfers unless specific conditions are met. When you’re uploading sensitive recordings to services without verified data processing agreements, these sections become relevant.
We’re minute taking course trainers, not lawyers. But the research raised enough questions that we felt the minute-taking community deserved to understand the full picture.
The 2025 Legal Developments
In August 2025, a major AI transcription platform faced a lawsuit alleging it recorded conversations without proper consent and used that data for AI training without user permission. In December, another popular service encountered legal action regarding biometric data collection. A prominent American university banned a widely-used transcription tool entirely, citing security and privacy risks.
These are the same platforms being used across corporate South Africa right now.
The lawsuits are ongoing. We can’t predict outcomes. But they’ve highlighted questions about data handling that weren’t being asked when these tools first appeared.
The legal landscape around AI is evolving rapidly. What seemed perfectly acceptable two years ago is now being challenged in courts. What corporate legal departments approved without concern is now under closer scrutiny.
We’re all navigating this shift together.
Why Minute Takers Are Particularly Exposed
When someone uses an AI tool to transcribe a podcast episode or personal notes, the privacy stakes are different. The content is theirs to share however they choose.
Minute takers handle genuinely confidential material on behalf of clients. Employment matters, medical information, legal discussions, strategic business planning. We’re operators processing personal information under POPIA on behalf of responsible parties.
That means we carry obligations the casual AI user doesn’t. If a client later discovers their confidential board discussion was processed through an overseas service with unclear data retention policies, that’s a professional issue. Even if we made the same choice every other professional was making at the time.
The reality is that AI transcription security has become everyone’s problem. But minute takers are on the front line because of the nature of what we handle. That’s why minute takers still need training in best practice, and will for some time to come.
The Phone Recording Question
Many practitioners have asked whether recording on mobile devices is compliant. There’s confusion on this point, so let’s clarify.
Local recording on your phone or laptop is fine under POPIA if participants are notified, you obtain consent, and the file remains on your device with appropriate security.
The compliance question arises when you do something with that recording. Upload it to cloud storage. Let it sync automatically to iCloud or Google Photos. Process it through an AI service. Email it to yourself.
Most phones automatically back up to cloud services. Those files are crossing borders, being processed on offshore servers, subject to terms of service you may not have read. That’s where the Section 72 cross-border transfer question becomes relevant.
What Compliant AI Transcription Actually Requires
Compliant AI transcription is possible. But it requires specific technical and contractual safeguards that most consumer-grade services don’t provide.
You need tools that explicitly commit not to use your data for model training. You need encryption, clear data retention policies, and immediate deletion capabilities. You need processing to occur either within South Africa or under proper cross-border transfer mechanisms. You need to understand whether tools join meetings as bots or process uploaded files.
Most importantly, you need to disclose your approach to clients before accepting engagements.
The Minute Takers Clinic has researched which platforms appear to have proper security commitments and how to use them within a POPIA framework.
If you’re looking for guidance on navigating this landscape, we can help. We’ve done the research so you don’t have to piece it together yourself.
When Manual Minute-Taking Is Still The Answer
For certain contexts, particularly those involving highly sensitive content, manual minute-taking remains the safest approach. Board meetings of listed companies, disciplinary hearings involving medical information, attorney-client consultations, research with human participants.
These aren’t failures of technology. They’re recognitions that different tools suit different contexts. Sometimes the old methods exist for good reasons.
That’s actually positive news for skilled minute takers. It means there’s still meaningful professional work that requires human judgment, discretion, and expertise. The profession isn’t being automated away as quickly as some feared.
Moving Forward in an Uncertain Landscape
As we adapt to the new reality of AI and the associated security considerations, we’re all having to re-examine practices that seemed perfectly reasonable when they were adopted.
This isn’t about blame. Blue chip corporates with entire compliance teams made the same technology choices. Sixty per cent of employees are using unauthorised AI tools. This is an industry-wide adaptation, not individual error.
What matters is how we respond now that we understand the landscape better.
The Minute Takers Clinic provides training on minute-taking best practice. We help minute takers understand which tools appear to have proper safeguards, how to implement compliant workflows, and when traditional methods are the safer choice.
Because in a rapidly evolving technological landscape, having expert guidance isn’t a luxury. It’s how you protect both your clients and your professional practice.
We’re here to support you through this transition. Always.
Note: This article reflects current understanding based on publicly available information about AI transcription services, POPIA requirements, and recent legal developments as of February 2026. It is provided for educational purposes and does not constitute legal advice. We encourage minute takers to seek guidance from qualified legal professionals regarding their specific compliance obligations. Technology providers’ policies and features may change, and pending litigation outcomes have not been determined.
The Minute Takers Clinic provides professional minute-taking training to support South African minute takers.

