
Healthcare turns to AI for medical note-taking ‘scribes’


Investment in artificial intelligence medical note-taking apps doubled in 2024, as tech giants and start-ups, including Microsoft and Amazon, raced to grab a slice of the $26 billion AI healthcare market.

According to data from PitchBook, AI start-ups focused on creating digital “scribes” for health professionals were on track to raise $800 million in 2024, up from $390 million in 2023.

Start-ups including Nabla, Heidi, Corti and Tortus have raised money in the past year from backers such as Khosla Ventures, Entrepreneur First and French tech billionaire Xavier Niel.

Funding surged as groups rushed to launch AI-powered products that aim to speed up medical note-taking for doctors and improve patient interactions, as healthcare becomes a key growth area in the AI boom.

Microsoft, which owns AI speech recognition company Nuance, along with Amazon and Oracle, has launched so-called AI copilots for physicians that use large language models and speech recognition to automatically generate transcripts of patient visits, highlight clinically relevant details and create clinical summaries.


“I don’t think I’ve seen anything more transformative in 15 years of healthcare,” said Harpreet Sood, a primary care physician in south London who has been trialling French start-up Nabla’s app for the past 15 months.

Sood, a former adviser on technology and innovation to the chief executive of NHS England, said that in a full-day clinic with around 40 patients, traditional note-taking could take at least two hours of typing.

“It’s brilliant, easily saving three to four minutes from each [10-minute] consultation and really helping to capture the consultation and what it’s about,” he added.

Nabla’s note-taking app uses Whisper, a transcription tool from ChatGPT maker OpenAI, and has been used to transcribe nearly 7 million medical visits as of October last year.

Hospitals and general practitioners across the UK’s National Health Service are testing AI note-taking as a way to save time and improve doctor-patient interactions. According to a Mayo Clinic study, physicians spend an average of one-third of their workday on administrative tasks, such as paperwork.

Meanwhile, Microsoft said Nuance’s DAX Copilot tool, which launched just over a year ago, is now documenting more than 1.3 million physician-patient encounters per month in more than 500 U.S. healthcare groups.

Nuance, which Microsoft bought in 2022 for about $20 billion, says the AI tool cuts the time physicians spend on clinical documentation by 50 percent.


At Stanford Medical School, more than 50 primary care physicians tested Nuance’s AI-powered note-taker in 2024, with two-thirds of users saying it saved time.

The AI-generated notes were closely checked by clinicians for accuracy, and the vast majority, about 90 percent, had to be edited manually to correct errors, said a person familiar with the trial.

Nevertheless, the results prompted Stanford to plan a roll-out of DAX Copilot to all providers.

Sood said that although he checks every note Nabla’s app generates, the cognitive load of simultaneously writing and listening during a consultation is greatly reduced “if not completely removed” by the tool.

“You can pay more attention to the patient, listen, be more present, understand their body language. I enjoy my consultations more now,” he added.

However, the rise of AI medical note-taking has prompted criticism from researchers about the dangers of AI-generated fabrications, known as “hallucinations”, which can be particularly harmful in a medical context, as well as questions about patient data privacy.

Researchers at Cornell University and the University of Virginia analyzed thousands of Whisper-generated transcript snippets in 2023 and found that about 1 percent contained “completely hallucinated phrases or sentences that did not exist in any form in the underlying audio”.

About 40 percent of the hallucinations included harmful content, such as perpetuating violence or wrong associations, the study said.

“I wouldn’t just rely on the tool; I would read every note and go back to the transcript,” Sood said. “There is work to be done but . . . for me personally, it’s been a big change.”
