Man develops bromide poisoning and psychosis after ChatGPT recommends toxic substance as salt substitute

Aug 8, 2025 · 1 source

Summary

A 60-year-old man in Washington was hospitalized for three weeks after following ChatGPT's advice to replace table salt (sodium chloride) with sodium bromide in his diet. The man was trying to eliminate chloride from his diet entirely and consulted ChatGPT, which suggested sodium bromide as an alternative. He consumed it for three months before presenting to an emergency department with paranoia, hallucinations, and confusion. Doctors diagnosed him with bromism — bromide toxicity — a condition once common in the early 20th century but now rare. The case was published in August 2025 in the Annals of Internal Medicine. OpenAI stated its terms of service prohibit use of ChatGPT for treating health conditions.

Incident Details

Domain
Autonomous Systems

Harms arising from AI or automated systems making consequential decisions without adequate oversight.

Harm Types
Medical AI Error
Mechanism
content
Platforms
Recipient
Individual — a man who followed ChatGPT's advice to replace table salt in his diet and poisoned himself
Dimensions
physical

Sources

1

This incident is documented by a single source. The source count reflects coverage in our monitored feeds, not the totality of reporting, and we do not evaluate publication quality.