
Doctors are increasingly using AI tools to take clinical notes, CNBC reports.

Clinical notes, and paperwork in general, take up a remarkably large chunk of physicians' time, often spilling into after-hours documentation work and contributing to the exhaustion and abysmal work-life balance that plague the high-burnout field of healthcare. Now, a growing tide of AI-powered transcription tools is emerging as a possible antidote to doctors' ever-piling paperwork mountains, with creators and users of the programs arguing that patient experiences will ultimately improve as a result.

Per CNBC, contenders in the burgeoning marketplace include the VC-backed platforms Abridge and Suki, in addition to the Microsoft-owned Nuance's DAX Copilot, which last week announced a partnership with Stanford Health Care to deploy its tech throughout the university's medical system.

"DAX Copilot is part of our broader strategy to leverage AI to transform the health care experience for our providers and the patients we serve," said Michael Pfeffer, the chief information officer and associate dean of Stanford Health Care and Stanford School of Medicine, said in a press release. "By automating clinical documentation, we can increase efficiency while improving the quality of the clinical data captured during each encounter."

The Microsoft-backed AI "has shown significant value for our physicians who used the solution during our initial pilot deployment," Pfeffer's statement continued, adding that "patients whose providers participated in the pilot also appreciated having their doctor's full and undivided attention during clinic or telehealth visits."

Generally speaking, these AI-powered transcription tools work more or less the same way. Doctors hit record on an app on their phones, and the program captures their conversations with patients, automatically generating clinical notes as it goes.

On the one hand, given the sheer amount of paperwork that physicians are tasked with, we understand the appeal. As a previous CNBC report noted, a February Athenahealth survey found that over 90 percent of doctors say they feel burnt out on a "regular basis," a situation that serves neither the healthcare system nor the patients seeking treatment.

That said, this use case for AI raises some ethical questions. While transcription tools can absolutely save time (we use AI to transcribe audio interviews ourselves!), they don't always get things right, and their outputs still require careful fact-checking.

Case in point: a Buffalo, New York-based hematologist and oncologist named Lauren Bruckner recently told Fast Company that a generative AI-powered transcription tool had noted that a teenage cancer patient was allergic to sulfa drugs. In reality, though, the AI had misunderstood a moment in their conversation in which Bruckner had told her young patient that it was a positive thing they weren't allergic to sulfa drugs.

In other words, the AI got the interaction entirely backward. And if doctors are already drowning in administrative work, when are they going to find the time to comb through and correct their auto-generated transcripts?

"That doesn't happen often," Bruckner, who serves as the chief information officer at the Roswell Park Comprehensive Cancer Center, told Fast Company of the AI's concerning mistake, "but clearly that's a problem."

Considering that these tools are designed to record and keep track of confidential doctor-patient conversations, protecting patient privacy and preventing sensitive data breaches is another notable area of concern. It's also unfortunately true that doctors have already been caught attempting to use oft-incorrect AI tools to diagnose and treat patients, a growing practice riddled with practical and ethical tripwires.

But proponents of infusing AI into doctor-patient interactions argue that physicians' administrative burden is already costing doctors and their patients dearly, an argument that may ultimately say more about the state of the American health system at large than it does about a general desire to automate.

"The moment that that first document returns to you, and you see your own words and the patient's own words being reflected directly back to you in a usable fashion," Christopher Sharp, chief medical information officer at Stanford Health Care, told CNBC of his experience with the DAX Copilot, "I would say that from that moment, you're hooked."

More on AI and healthcare: Regulators Alarmed by Doctors Already Using AI to Diagnose Patients

