Hospitals use a transcription tool powered by a hallucination-prone OpenAI model

[Image: An illustration of a woman typing on a keyboard, her face replaced with lines of code. Credit: The Verge]

A few months ago, my doctor showed off an AI transcription tool he used to record and summarize his patient meetings. In my case, the summary was fine, but researchers cited by ABC News have found that’s not always the case with OpenAI’s Whisper, which powers a tool many hospitals use — sometimes it just makes things up entirely.

Whisper is used by a company called Nabla for a medical transcription tool that it estimates has transcribed 7 million medical conversations, according to ABC News. More than 30,000 clinicians and 40 health systems use it, the outlet writes. Nabla is reportedly aware that Whisper can hallucinate, and is “addressing the problem.”
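Nabla hasn't published its pipeline, but for readers curious what "powered by Whisper" means in practice, here is a minimal sketch using OpenAI's open-source whisper Python package. The checkpoint size and the audio file name are placeholders, not details from Nabla's product, and the raw output is exactly the kind of text that can contain hallucinated passages.

```python
# pip install openai-whisper
import whisper

# Load an open-source Whisper checkpoint; larger checkpoints
# trade speed for accuracy ("base" is a small, fast option).
model = whisper.load_model("base")

# Transcribe a recording; "visit.mp3" is a hypothetical file name.
result = model.transcribe("visit.mp3")

# The transcript is plain text with no confidence markers, so any
# fabricated sentences look identical to correctly transcribed ones.
print(result["text"])
```

Because the output carries no signal distinguishing real speech from invented text, any clinical use of a transcript like this would need human review, which is precisely the concern the researchers raise.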

A group of researchers from Cornell University, the University of Washington, and...




Source: https://www.theverge.com/2024/10/27/24281170/open-ai-whisper-hospitals-transcription-hallucinations-studies
