A man was hospitalized for weeks and suffered from hallucinations after poisoning himself based on dietary advice from ChatGPT.
A case study published Aug. 5 in the Annals of Internal Medicine, an academic journal, says the 60-year-old man decided he wanted to eliminate salt from his diet. He asked ChatGPT for an alternative to salt (sodium chloride), and the AI chatbot suggested sodium bromide, a compound historically used in pharmaceuticals and manufacturing.
Though the study's authors noted that they were unable to review the original AI chat logs and that the bot likely suggested the substitution for another purpose, such as cleaning, the man purchased sodium bromide and used it in place of table salt for three months.
As a result, he ended up in the hospital emergency room with paranoid delusions, despite having no history of mental health problems. Convinced that his neighbor was poisoning him, the man was reluctant to accept even water from the hospital, despite reporting extreme thirst. His paranoia worsened, and he experienced auditory and visual hallucinations, eventually landing in an involuntary psychiatric hold after he tried to escape during treatment.
What was happening to the man?
Doctors determined that the man was suffering from bromide toxicity, or bromism, which can cause neurological and psychiatric symptoms as well as acne, cherry angiomas (small red skin growths), fatigue, insomnia, subtle ataxia (clumsiness) and polydipsia (excessive thirst).
Other symptoms of bromism can include nausea and vomiting, diarrhea, tremors or seizures, drowsiness, headache, weakness, weight loss, kidney damage, respiratory failure and coma, according to iCliniq.
Bromism was once far more common because of bromide salts in everyday products. In the early 20th century, bromide was used in over-the-counter medications, often resulting in neuropsychiatric and dermatological symptoms, according to the study's authors. Such poisonings declined sharply after the Food and Drug Administration phased out the use of bromides in pharmaceuticals between the mid-1970s and the late 1980s.
The man was treated at the hospital for three weeks, during which his symptoms progressively improved.
USA TODAY reached out to OpenAI, the maker of ChatGPT, for comment on Aug. 13 but had not received a response.
The company provided Fox News Digital with a statement saying: “Our terms say that ChatGPT is not intended for use in the treatment of any health condition, and is not a substitute for professional advice. We have safety teams working on reducing risks and have trained our AI systems to encourage people to seek professional guidance.”
AI can ‘fuel the spread of misinformation,’ doctors say
Doctors involved in the case study said they suspected that the patient had used ChatGPT version 3.5 or 4.0, the former of which they tested in an attempt to replicate the answers the man received. The study’s authors noted they couldn’t know exactly what the man was told without the original chat log, but they did receive a suggestion for bromide as a replacement for chloride in their tests.
“Though the reply stated that context matters, it did not provide a specific health warning, nor did it inquire about why we wanted to know, as we presume a medical professional would do,” said study authors Dr. Audrey Eichenberger, Dr. Stephen Thielke and Dr. Adam Van Buskirk.
AI carries the risk of providing information without context, according to the doctors. For example, it is unlikely that a medical expert would have mentioned sodium bromide at all if a patient asked for a salt substitute.
“Thus, it is important to consider that ChatGPT and other AI systems can generate scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation,” the study said.
This article originally appeared on USA TODAY: Man hospitalized after taking ChatGPT diet advice, study says