Doctors Are Using AI to Transcribe Conversations With Patients. But Researchers Say the Tool Is Hallucinating ‘Entire’ Sentences.

ChatGPT-maker OpenAI introduced Whisper two years ago as an AI tool that transcribes speech to text. Today, AI healthcare company Nabla uses Whisper to help its 45,000 clinicians transcribe medical conversations across more than 85 organizations, including University of Iowa Health Care.

However, new research shows that Whisper has been "hallucinating," inventing statements that no one said and inserting them into transcripts of conversations, raising the question of how quickly medical facilities should adopt AI tools that produce such errors.

According to the Associated Press, a University of Michigan researcher found hallucinations in 80% of the Whisper transcriptions he examined. An unnamed developer found hallucinations in about half of the more than 100 hours of Whisper transcriptions he analyzed.

→ Continue reading at Entrepreneur
