GPs and hospitals are turning to AI scribes, so how do they work and what are the risks?
Artificial intelligence seems to be everywhere these days — so it's no surprise the innovation has made its way to the healthcare sector.
Its most recent venture? AI scribes.
From hospitals to GP clinics, the tool has been lauded by burnt-out doctors for helping to lighten the back-breaking load.
Dr Ben Condon, who now works for an Australian health technology company, first stumbled across AI during a shift in a rural Western Australian emergency department.
"I was blown away by how much it helped, in terms of giving me an extra set of hands … writing all my notes," he said.
But experts have warned there are real risks: patient information from data breaches can fetch a pretty penny on the dark web, and the technology has the potential to generate "misleading" information.
As the name suggests, an AI scribe uses software to transcribe consultations between doctors and patients in real time — a bit like a courtroom typist.
It can then generate a medical note for the doctor to review and approve to be added to a patient's medical record.
"I would easily say [it can save] hours in a shift," Dr Condon said.
"It's really tangible … in a 15-minute consultation, you're typically writing notes for 8-10 minutes, this cuts it down to a minute or less.
"Amplified across a shift it's pretty significant."
According to Queensland Health, three hospitals and health services are currently piloting AI scribes.
Queensland president of the Australian Medical Association, Dr Nick Yim, said the move was "revolutionising" the way doctors work and patients are treated.
He said it complemented the ongoing rollout of other digital innovations, like the integrated electronic medical record (ieMR).
"ieMR provides the base system upon which AI tools can be added ... the rollout has been years in the making, and we welcome news it is now in place in 80 sites across the HHSs," Dr Yim said.
While experts recognise that tools to reduce the workload of overworked doctors can fill an important gap, they have also warned of the potential pitfalls.
Saeed Akhlaghpour, an associate professor at the University of Queensland and an expert in AI and health data protection, said he was cautiously optimistic about the growing use of the technology.
"There are immediate benefits [but] that said, the risks are real and I have several major concerns — they're shared by many clinicians, legal experts and privacy regulators," he said.
AI scribes are not currently regulated by the Therapeutic Goods Administration (TGA), so legal responsibility for anything generated by the software lies with the doctor, Dr Akhlaghpour added.
"These tools can make mistakes, especially with strong accents, noisy rooms or medical jargon.
A recent review by the Royal Australian College of General Practitioners noted that hallucination rates (misleading outputs generated by the AI tools) range from 0.8 per cent to nearly 4 per cent.
Dr Akhlaghpour said that can lead to errors in records and risks to patient safety.
"Unlike minor typos, these mistakes can have serious consequences if left uncorrected in a patient's health record," he said.
Recently, airline Qantas fell victim to a major cyber attack, the latest in a long list of incidents where private information has been leaked online.
As hacking attempts and data breaches become more common, and the use of AI in the healthcare sector grows, how to protect patient information remains a key issue.
Dr Akhlaghpour said while safeguards are in place, their effectiveness comes down to how individual clinics and health services implement them.
"Hospitals and clinics need good internal systems: proper training, IT security, multi-factor authentication, and up-to-date privacy notices," he said.
"Choosing a reputable vendor matters too — one that stores data locally, uses encryption, and doesn't reuse patient data for AI training without consent."
Dr Akhlaghpour said he was not aware of any reported data breaches involving AI scribe technology in Australia — but the broader healthcare sector has seen serious incidents.
"That's why protecting them isn't optional — it's critical."
Before agreeing to the use of an AI scribe, patients should feel confident asking: