
Healthcare’s Hidden ADM Problem: The December 2026 Deadline No Australian Clinic Can Ignore
There’s a feature of Australia’s Privacy Act that healthcare professionals routinely overlook, and it makes the 10 December 2026 commencement of new automated decision-making transparency rules a much bigger problem for the sector than it is for any other.
Most Privacy Act obligations only reach organisations with annual turnover above three million dollars. Healthcare doesn’t get that exemption. Health service providers are APP entities regardless of turnover. A solo GP practice, a single-clinician psychology practice, a small allied health provider: all of them are subject to the same transparency obligations as a state health service or a major private hospital group.
That’s the first thing every healthcare practice in Australia needs to internalise. There is no small-business carve-out from the December deadline for healthcare.
Where ADM is happening in Australian healthcare
The shift over the past three years has been substantial. Healthcare in Australia today runs on algorithms in a way it didn’t even at the start of the pandemic.
Triage and prioritisation tools are perhaps the most visible. Many emergency departments, and increasingly GP practices as well, use computational triage support to assess urgency. These systems take symptom data, demographic information, sometimes prior medical history, and they output a priority classification or a recommended care pathway. That’s a computer program using personal information to make a decision that significantly affects an individual’s interests. The decision affects care timing, sometimes care quality, occasionally care outcomes.
Clinical decision support is another. AI-assisted diagnostic tools are now routine in radiology, pathology, dermatology, and ophthalmology. Even where the final clinical decision rests with a clinician, the system substantially and directly contributes to the decision in ways that satisfy the legislative trigger.
AI scribes have proliferated. Most of the major scribe products now offer some form of automated summarisation, coding suggestion, or follow-up recommendation. Once a system is recommending diagnostic codes that affect billing or proposing follow-up actions that affect care, it has crossed into the territory the new APP 1.7 requirements address.
Mental health risk flagging systems, increasingly common in both primary care and acute settings, raise especially sensitive issues. A flag that triggers an intervention, a referral, or a notification has very real consequences for a patient’s autonomy and care experience.
Then there are the back-office systems. Appointment prioritisation algorithms. Insurance claim adjudication on the private side. Medicare and DVA eligibility tooling. Automated chronic disease registry management. Each one of these may use personal health information to contribute to decisions that affect a patient’s access to care.
And, as everywhere else, the Microsoft 365 layer is now a significant source of ADM. Copilot Studio agents in healthcare environments handle referral routing, appointment management, billing queries, patient communication. Many of them touch personal health information. Many of them contribute to decisions about care access or care timing.

The sensitivity multiplier
Health information is classified as sensitive information under the Privacy Act. The transparency obligation under APP 1.7 doesn’t change that (the new rules are about disclosure, not restriction), but the underlying sensitivity makes the disclosure work more important and the consequences of non-compliance more serious.
A non-compliant privacy policy in healthcare doesn’t just expose the practice to OAIC infringement notices. It potentially exposes the practice to AHPRA-related professional consequences, contractual issues with state health departments or private health insurers, and reputational damage that’s difficult to undo in a sector that runs on trust.
What clinics actually need to do
The practical first step is an inventory. For most clinics, this means walking through every clinical and administrative workflow and asking: is there any point at which a computer program uses information about an individual patient to make or contribute to a decision affecting that patient’s care, access, or treatment?
The honest answer in 2026 is: yes, in more places than the average practice realises. The triage tool counts. The AI scribe counts. The diagnostic support overlay in the imaging system counts. The Copilot agent that auto-prioritises GP follow-up calls counts.
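To make that walkthrough concrete, it can help to capture the screening question as a simple list rather than a conversation that never gets written down. The sketch below is illustrative only: the workflow names and the in-scope flags are assumptions, not findings about any particular product.

```python
# Minimal ADM screening sketch: one entry per clinical or administrative
# workflow, answering the single screening question from the step above.
# Workflow names and flags are illustrative assumptions, not findings.

workflows = {
    "ED / practice triage tool":            True,   # outputs a priority classification
    "AI scribe with coding suggestions":    True,   # proposes diagnostic codes and follow-ups
    "Imaging diagnostic support overlay":   True,   # contributes to the clinical decision
    "Copilot agent prioritising follow-up": True,   # decides call order from patient data
    "Static appointment reminder SMS":      False,  # no decision is made about the patient
}

in_scope = [name for name, uses_adm in workflows.items() if uses_adm]
print(f"{len(in_scope)} workflows need an ADM inventory entry:")
for name in in_scope:
    print(" -", name)
```

Even a list this rough is enough to show most practices that the answer is not one or two systems.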
The second step is to understand what each system actually does with personal information, what it takes in, what it produces, and how the output influences decisions about the patient. This is where healthcare practices typically have the largest gap. Many of these systems are SaaS, and the vendor disclosure documents are often more about marketing than about technical clarity.
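One way to force that clarity is to record the same handful of fields for every in-scope system. The sketch below is a minimal illustration; the field names and example values are assumptions about what a defensible inventory entry might contain, not a prescribed template.

```python
from dataclasses import dataclass

@dataclass
class ADMInventoryEntry:
    """One in-scope system, described in the terms the disclosure will need."""
    system: str                  # product or agent name
    vendor: str                  # who holds the technical documentation
    inputs: list[str]            # kinds of personal information taken in
    output: str                  # what the system produces
    decision: str                # the decision the output feeds into
    involvement: str             # "wholly automated" or "substantially and directly contributes"
    vendor_docs_obtained: bool = False

# Illustrative entry only; values are assumed, not drawn from any real product.
triage = ADMInventoryEntry(
    system="Triage support tool",
    vendor="Example vendor",
    inputs=["symptoms", "demographics", "prior medical history"],
    output="priority classification / recommended care pathway",
    decision="how quickly and via which pathway the patient is seen",
    involvement="substantially and directly contributes",
)
print(triage)
```

The `vendor_docs_obtained` field earns its place: it is the gap between what the practice must disclose and what the vendor has actually documented.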
The third step is to update the privacy policy with the disclosures the legislation requires. The kinds of personal information used in ADM. The kinds of decisions made wholly by ADM. The kinds of decisions where ADM substantially and directly contributes.
This is not a one-page addendum job. For most healthcare practices using current-generation AI tooling, it’s a serious privacy policy rewrite, and it needs to be supported by an underlying inventory that the practice can actually defend if asked.
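To show how the inventory and the policy text connect, here is a rough sketch that groups inventory entries by level of ADM involvement and drafts the disclosure lists the legislation asks for. The grouping logic and wording are assumptions; the actual policy language should come from legal review, not a script.

```python
from collections import defaultdict

# Entries would normally come from the inventory built in the previous step;
# the structure mirrors the fields sketched above (values are illustrative).
entries = [
    {"decision": "how quickly and via which pathway the patient is seen",
     "inputs": ["symptoms", "demographics", "prior medical history"],
     "involvement": "substantially and directly contributes"},
    {"decision": "which diagnostic codes are applied for billing",
     "inputs": ["consultation transcript", "clinical notes"],
     "involvement": "substantially and directly contributes"},
]

by_involvement = defaultdict(set)
info_kinds = set()
for e in entries:
    by_involvement[e["involvement"]].add(e["decision"])
    info_kinds.update(e["inputs"])

print("Kinds of personal information used in ADM:")
for kind in sorted(info_kinds):
    print(" -", kind)
for level, decisions in by_involvement.items():
    print(f"\nDecisions where ADM {level}:")
    for d in sorted(decisions):
        print(" -", d)
```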

Where this gets harder
Two specific issues make healthcare’s December 2026 readiness work harder than the equivalent in other sectors.
The first is vendor opacity. Many of the AI tools in clinical use are sold by vendors who themselves don’t fully document what their models do or what data they use. The practice carries the disclosure obligation, but the technical knowledge often sits with the vendor. Practices need to start pushing vendors for the documentation needed to support compliant disclosure now, not in November.
The second is the workflow change implication. A privacy policy that says “we use AI to flag mental health risk” forces the practice to think about whether the disclosure also implies a need for clearer consent processes, opt-out mechanisms, or human-in-the-loop review checkpoints. The transparency obligation doesn’t directly require those things, but a thoughtful disclosure exercise often surfaces them as worth addressing in their own right.
Healthcare has roughly seven months to do this work properly. That’s enough time. It’s not enough time for the practices that wait until October.
Jan Davids
Principal Consultant, Aureus Solutions
Microsoft AI Cloud Partner | Adelaide, SA
Sources
1. Privacy and Other Legislation Amendment Act 2024 (Cth). Federal Register of Legislation. https://www.legislation.gov.au/C2024A00128/asmade
2. Office of the Australian Information Commissioner, Chapter 1: APP 1 - Open and transparent management of personal information (APP Guidelines). https://www.oaic.gov.au/privacy/australian-privacy-principles/australian-privacy-principles-guidelines/chapter-1-app-1-open-and-transparent-management-of-personal-information
3. Macpherson Kelley, Automated Decision-Making: Current privacy obligations and what’s in the pipeline for 2026. https://mk.com.au/automated-decision-making-current-privacy-obligations-and-whats-in-the-pipeline-for-2026/
4. JacMac Lawyers, Complying with the new transparency requirements for automated decision making. https://www.jacmac.com.au/insights/complying-with-the-new-transparency-requirements-for-automated-decision-making/
5. Norton Rose Fulbright, Australian Privacy Alert: Parliament passes major and meaningful privacy law reform (December 2024). https://www.nortonrosefulbright.com/en/knowledge/publications/be98b0ff/australian-privacy-alert-parliament-passes-major-and-meaningful-privacy-law-reform
6. MinterEllison, OAIC ramps up privacy enforcement: are you ready? (February 2026). https://www.minterellison.com/articles/oaic-ramps-up-privacy-enforcement-are-you-ready

