A 60-year-old man trusted ChatGPT’s dietary advice and ended up hospitalized with a rare 19th-century poisoning condition.
Story Highlights
- ChatGPT recommended toxic sodium bromide as a salt substitute, causing severe neurological symptoms
- Patient developed paranoia, hallucinations, and bromism after three months of following AI advice
- First documented case of AI-generated health advice leading to bromide poisoning
- Medical experts warn against trusting unregulated AI systems for health decisions
ChatGPT’s Dangerous Salt Substitute Recommendation
A health-conscious man seeking to reduce his sodium intake made a decision that nearly cost him his sanity. When he consulted ChatGPT for dietary alternatives to table salt, the AI system recommended sodium bromide as a substitute. This recommendation represents a catastrophic failure in AI safety: sodium bromide is an industrial chemical with known toxicity, not a food ingredient. The man purchased the substance online and incorporated it into his meals for approximately three months, trusting the AI's guidance rather than seeking advice from a physician.
ChatGPT dietary advice sends man to hospital with dangerous chemical poisoning https://t.co/xHusrGYyvV #LLMs
— Epic Plain (@EpicPlain) August 14, 2025
The consequences were swift and severe. Within months, the patient began experiencing a constellation of neuropsychiatric symptoms that would have been familiar to doctors in the 1800s but is rarely seen today. His symptoms included paranoia, vivid hallucinations, chronic insomnia, debilitating fatigue, poor coordination, excessive thirst, and noticeable skin changes. These are classic signs of bromism, a condition that was common when bromides were widely used in medicine but has been virtually eliminated from modern medical practice.
Medical Emergency and Rare Diagnosis
The patient's deteriorating condition eventually required hospitalization for comprehensive evaluation of his psychiatric and neurological symptoms. Medical professionals faced the challenge of diagnosing a condition that most contemporary doctors have never encountered. After thorough investigation, they identified bromism, or bromide intoxication, caused by chronic exposure to the toxic substance. Reaching the diagnosis required careful electrolyte monitoring, and treatment involved intravenous fluids, electrolyte replacement, and antipsychotic medications to manage his severe psychological symptoms.
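One laboratory clue worth noting: in documented bromism cases, bromide ions are misread as chloride by many automated analyzers, inflating the chloride result and sometimes driving the serum anion gap negative. As an illustrative sketch (the specific values below are hypothetical, not from this patient's chart), the standard anion gap calculation looks like this:

```python
def anion_gap(sodium_mmol_l: float, chloride_mmol_l: float,
              bicarbonate_mmol_l: float) -> float:
    """Standard serum anion gap: Na+ - (Cl- + HCO3-), in mmol/L."""
    return sodium_mmol_l - (chloride_mmol_l + bicarbonate_mmol_l)

# Typical healthy values yield a gap of roughly 8-12 mmol/L.
normal = anion_gap(140, 104, 24)       # 12

# In bromism, spuriously elevated "chloride" can push the gap below zero.
suspicious = anion_gap(140, 155, 24)   # -39

if suspicious < 0:
    print("Negative anion gap: consider halide (e.g., bromide) interference")
```

A negative anion gap is one of the findings that can prompt clinicians to test for bromide directly, since routine panels do not measure it.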
The patient’s three-week hospitalization underscores the severity of AI-generated medical misinformation. His recovery required immediate cessation of sodium bromide consumption, aggressive medical intervention, and careful monitoring to prevent complications. Medical professionals documented this case as the first known instance of bromism caused by AI dietary advice, marking a disturbing milestone in the intersection of technology and public health. The patient eventually recovered and remained stable at follow-up appointments, but his experience serves as a stark warning about the real-world consequences of trusting AI systems with health decisions.
Historical Context of a Forgotten Poison
Sodium bromide's toxic history dates back to the 19th and early 20th centuries, when bromide salts were commonly prescribed as sedatives and anticonvulsants. Medical professionals eventually recognized their dangerous side effects and discontinued therapeutic use in favor of safer alternatives. Bromism was once a familiar diagnosis, but the condition all but disappeared as bromide-containing medications were phased out. The substance is now used primarily in industrial applications, making its suggested use as a dietary substitute particularly alarming and inappropriate.
This case represents a troubling revival of a historical medical condition through modern technology. The incident illustrates how AI systems can resurrect dangerous practices from the past by providing decontextualized information without proper medical oversight.
Sources:
Live Science – Man sought diet advice from ChatGPT and ended up with bromide intoxication
MedicalToxic – AI misfire: Sodium bromide poisoning shows why medical oversight matters
GulfLink Health (RAND Report) – Bromide toxicity information