OpenAI’s Whisper AI Transcription Tool in Hospitals Found to Invent Statements, Raising Concerns
Oct 26, 2024, 11:30 AM
Researchers have raised concerns about an AI-powered transcription tool used in hospitals that generates statements never made by patients or medical staff. The tool, identified as OpenAI’s Whisper, has been found to 'hallucinate': fabricating phrases or entire sentences, including imaginary treatments and racial comments, that do not appear in the source audio. This raises critical questions about the accuracy and reliability of AI in medical documentation and its potential impact on patient care and communication. The findings, reported by SFGATE and ABC News, highlight the need for careful evaluation and oversight of AI technologies in healthcare settings.