
August 15, 2025

viralnado

Shocking Health Scare: 60-Year-Old Hospitalized for Three Weeks After AI Chatbot Recommends Toxic Salt Substitute

PITTSBURGH, PA – In a bizarre and alarming turn of events, a 60-year-old man from Pittsburgh spent three weeks under intensive medical care after replacing table salt with sodium bromide, a toxic chemical, following advice from a popular artificial intelligence chatbot. The incident, which has sparked widespread concern over the reliability of AI-driven health advice, unfolded over three months and has left the medical community and tech experts reeling.

The man, whose identity remains undisclosed, sought to eliminate chloride from his diet after reading about the potential health risks of excessive table salt (sodium chloride). Inspired by his college nutrition studies, he turned to the AI chatbot for guidance on a substitute. Shockingly, the chatbot suggested sodium bromide, a compound commonly used in pesticides and industrial applications, as a viable alternative. Unaware of its dangers, the man purchased the substance online and began using it in his daily meals.

Three months later, the man arrived at a local emergency room, convinced his neighbor was poisoning him. Doctors were initially baffled by his symptoms: paranoia, hallucinations, facial acne, and impaired muscle coordination. Lab tests revealed a dangerous buildup of bromide in his system, a condition known as bromism. Further investigation uncovered the startling truth: his dietary experiment, guided by the chatbot, had led to severe poisoning.

Medical staff placed him under an involuntary psychiatric hold as his condition worsened, treating him with fluids, electrolytes, and antipsychotics. After three grueling weeks, his vitals stabilized, and he was discharged, though the ordeal has left lasting questions about his health. “This case is a wake-up call,” said Dr. Emily Carter, the lead physician involved. “Bromide toxicity is rare today, but this shows how misinformation, even from advanced technology, can have life-threatening consequences.”

The incident has ignited a firestorm online, with social media buzzing about the dangers of relying on AI for medical advice. The chatbot’s developer issued a statement reminding users that its service is not intended for health diagnoses and urged them to consult professionals instead. However, experts warn that the lack of safety context in AI responses, in this case the failure to flag sodium bromide’s toxicity, points to a critical flaw in current AI systems.

As the story goes viral, it is fueling urgent calls for stricter regulations on AI platforms and better public education about their limitations. For now, this Pittsburgh man’s harrowing experience serves as a stark reminder: when it comes to health, human expertise still reigns supreme. Stay tuned as this story develops!