Whisper's Misuse Causes Hallucinations
This is an OpenAI news story, published by the New York Post, that relates primarily to Whisper.
• Researchers say an AI-powered transcription tool used in hospitals invents things no one ever said.
Whisper is being used in a slew of industries worldwide to translate and transcribe interviews, generate text in popular consumer technologies and create subtitles for videos.
But Whisper has a major flaw: It is prone to making up chunks of text or even entire sentences.
Some of the invented text can include racial commentary, violent rhetoric and imagined medical treatments.
In one case, Whisper invented a non-existent medication called "hyperactivated antibiotics." Researchers aren't certain why Whisper and similar tools hallucinate, but software developers said the fabrications tend to occur amid pauses, background sounds, or music.
OpenAI recommended against using Whisper in “decision-making contexts, where flaws in accuracy can lead to flaws in outcomes”.
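The observation that fabrications cluster around pauses and background noise suggests one practical safeguard: Whisper's transcription output includes a per-segment `no_speech_prob` score, which can be used to flag text produced over near-silent audio. Below is a minimal sketch, assuming Whisper-style segment dictionaries; the 0.6 threshold and the sample data are illustrative assumptions, not values from the article.

```python
def filter_segments(segments, no_speech_threshold=0.6):
    """Keep only segments where the model was reasonably confident
    that actual speech occurred (low no_speech_prob)."""
    return [s for s in segments if s["no_speech_prob"] < no_speech_threshold]

# Hypothetical sample: two confident segments and one produced over a pause.
segments = [
    {"text": "Patient reports mild fever.", "no_speech_prob": 0.02},
    {"text": "hyperactivated antibiotics", "no_speech_prob": 0.91},  # likely hallucination
    {"text": "Prescribed rest and fluids.", "no_speech_prob": 0.05},
]

kept = filter_segments(segments)
print([s["text"] for s in kept])
# → ['Patient reports mild fever.', 'Prescribed rest and fluids.']
```

Filtering by `no_speech_prob` does not prevent hallucinations, but it can surface segments that warrant human review before a transcript is used downstream.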
VR Score: 87
Informative language: 86
Neutral language: 69
Article tone: semi-formal
Language: English
Language complexity: 66
Offensive language: possibly offensive
Hate speech: not hateful
Attention-grabbing headline: not detected
Known propaganda techniques: not detected
Time-value: short-lived
External references: 10
Source diversity: 9
Affiliate links: no affiliate links