Man Nearly Dies After Following ChatGPT Diet Advice
A 60-year-old man developed bromism, a form of bromide intoxication, after following unsafe diet advice from ChatGPT. Bromism was common a century ago but is rare today, and this case has become a cautionary tale about how persuasive AI answers can still be dangerously wrong.
From Salt-Free Goal to Toxic Swap
The man wanted to eliminate table salt (sodium chloride) from his diet. Instead of simply cutting back, he searched for a complete substitute. After asking an AI chatbot, he replaced salt with sodium bromide, a compound once used in old sedatives and still found in some industrial products. It is not safe to consume as food.
He used sodium bromide in every meal for three months. Then a wave of symptoms hit. He developed paranoia, auditory and visual hallucinations, severe thirst, fatigue, insomnia, poor coordination, facial acne, cherry angiomas, and a rash. He feared his neighbor was poisoning him, avoided tap water, and distilled his own. When he tried to leave the hospital during evaluation, doctors placed him on an involuntary psychiatric hold for his safety.
Clinicians identified bromism, caused by high bromide in the body. The condition can trigger neurological, psychiatric, and skin symptoms. It was far more common when bromide salts were prescribed for anxiety and insomnia. Lab tests also showed pseudohyperchloremia—a false high chloride reading—because bromide interfered with the assay. That artifact can confuse diagnosis.
Treatment and Recovery
Doctors provided IV fluids, electrolyte correction, and short-term antipsychotic medication. His symptoms improved over three weeks. He was discharged in stable condition and remained well at two-week follow-up.
How the AI Answer Fell Short
The case report appears in Annals of Internal Medicine: Clinical Cases (2025). Its authors, Audrey Eichenberger, Stephen Thielke, and Adam Van Buskirk, asked ChatGPT 3.5 a similar question. The model replied that “context matters,” but it did not warn against ingesting sodium bromide or prompt any medical follow-up. A clinician would have asked clarifying questions and ruled out unsafe substances.
Practical Safety Tips
Never replace food ingredients with chemicals suggested online without medical guidance.
Be wary of “alternatives” for salt; many are not physiologically equivalent.
If symptoms appear after a diet change, stop the change and seek care.
Use AI for general education only; do not treat it as medical advice.
Bottom Line
Generative AI can sound confident, yet omit critical warnings. For anything that affects your health—especially diet, drugs, or supplements—talk to a qualified professional first. One wrong substitution can be dangerous.
Sources:
Live Science – Man develops bromism after ChatGPT diet advice
Ars Technica – ChatGPT diet advice leads to rare poisoning
NDTV – Rare bromism case linked to AI