September 14, 2025

The man’s original concern was simple: he wanted to cut down on salt. Instead of suggesting common alternatives like herbs, spices, or potassium-based salt substitutes, the AI pointed him to sodium bromide as a replacement for table salt. To most people that might sound like a reasonable chemical stand-in, since it even has “sodium” in the name, and bromide really can stand in for chloride in some industrial uses. But sodium bromide is toxic to humans when consumed regularly.
For nearly three months, he sprinkled sodium bromide over his food as if it were ordinary salt. Over time, the effects began to show. He developed severe insomnia, paranoia, and delusional thinking; at one point he became convinced that his neighbors were poisoning him. His skin broke out in acne-like eruptions, and his mental state continued to deteriorate. These are classic signs of bromism, a toxic syndrome caused by chronic bromide exposure. Eventually his condition grew serious enough that he had to be admitted to the hospital.
Doctors who treated him pieced the story together and realized that the cause wasn’t a mysterious illness but straightforward misinformation: no qualified health professional would recommend sodium bromide for human consumption, yet the AI had presented it as a legitimate salt substitute.
The case has since been written up by medical researchers as a warning about the risks of relying on large language models for medical or dietary advice. Unlike human experts, AI systems don’t apply real-world judgment, can mix up technical contexts (a chemical that can replace salt in an industrial process is not a replacement in cooking), and may confidently offer answers that are completely wrong, and in this case dangerous.
The broader lesson here is straightforward: while AI can be helpful in many ways, it isn’t a doctor, and it isn’t a nutritionist. For serious matters involving health, the safest path is still to consult a qualified professional. Otherwise, a simple search for “healthy alternatives” could end up sending you to the hospital.