ChatGPT dietary advice sends man to hospital with dangerous chemical poisoning

Apr 14, 2026 · 1 source

Summary

A man was hospitalized with dangerous chemical poisoning after following dietary advice from ChatGPT. According to the article, he ingested a harmful substance on the strength of the AI-generated recommendation; neither the specific chemical nor the location of the incident was disclosed. He required medical treatment in hospital, and the case has raised concerns about the reliability of health advice provided by AI platforms.

Incident Details

Domain
Misinfo & Disinfo

Deliberate or negligent spread of false or misleading information with harmful real-world effects.

Harm Types
Misinformation

False or misleading content spread without intent to deceive.

Mechanism
content
Platforms
Companies
Recipient
Individual: a man who followed ChatGPT's dietary advice
Dimensions
physical

Sources

This incident is documented by a single source. Source count reflects coverage in our monitored feeds, not the totality of reporting, and we do not evaluate publication quality.