ChatGPT Salt Advice - “The Double-Ineligibility Achievement”

Ineligible

Nominee: An unnamed 60-year-old man who trusted ChatGPT with medical dietary advice over professional healthcare guidance.

Reported by: An American College of Physicians Journals case report, subsequently covered by Rachel Dobkin (The Independent), August 7, 2025.

The Innovation

Inspired by his college nutrition studies, our nominee decided to eliminate chloride from his diet. Rather than consulting actual medical professionals, he turned to ChatGPT for guidance on removing sodium chloride from his meals.

The Catastrophe

ChatGPT recommended replacing table salt with sodium bromide—apparently confusing dietary advice with cleaning instructions. Our intrepid experimenter dutifully followed this guidance for three months, leading to bromism (bromide toxicity) complete with paranoia, hallucinations, and a three-week hospital stay.

The Double Ineligibility

Our nominee achieved the remarkable feat of being too small-scale for the AI Darwin Awards (affecting only himself rather than thousands) and too alive for the traditional Darwin Awards (having survived his spectacular poisoning adventure). He's managed to create the “Award Eligibility Event Horizon”—decisions so spectacularly poor they transcend categories of recognition, yet so non-fatal and non-systemic they qualify for absolutely nothing.

Sources: American College of Physicians Journals case report | The Independent: "A man asked ChatGPT how to remove sodium chloride from his diet. It landed him in the hospital"


Ready for More AI Disasters?

This is just one of many spectacular AI failures to earn a nomination in 2025 so far.