OpenAI’s Whisper AI Transcription Tool in Hospitals Found to Invent Statements, Raising Concerns
Oct 26, 2024, 11:30 AM
Researchers have raised concerns about an AI-powered transcription tool used in hospitals that generates statements never made by patients or medical staff. The tool, identified as OpenAI’s Whisper, has been found to "hallucinate," fabricating phrases or entire sentences, including imaginary treatments and racial comments. These findings raise critical questions about the accuracy and reliability of AI in medical documentation and its potential impact on patient care and communication. The findings, reported by SFGATE and ABC News, highlight the need for careful evaluation and oversight of AI technologies in healthcare settings.