Tag Archives: hallucinations

Are bad incentives to blame for AI hallucinations?

A new research paper from OpenAI asks why large language models like GPT-5 and chatbots like ChatGPT still hallucinate, and whether anything can be done to reduce those hallucinations. In a blog post summarizing the paper, OpenAI defines hallucinations as “plausible but false statements generated by language models,” and it acknowledges that despite improvements, hallucinations “remain a fundamental challenge for …

Read More »

Man who asked ChatGPT about cutting out salt from his diet was hospitalized with hallucinations

A 60-year-old man spent three weeks being treated at a hospital after replacing table salt with sodium bromide following consultation with the popular artificial intelligence bot ChatGPT. Three physicians published a report on the case in the Annals of Internal Medicine earlier this month. According to the report, the man had no prior psychiatric history when he arrived at the …

Read More »

Man took diet advice from ChatGPT, ended up hospitalized with hallucinations

A man was hospitalized for weeks and suffered from hallucinations after poisoning himself based on dietary advice from ChatGPT. A case study published Aug. 5 in the Annals of Internal Medicine, an academic journal, says the 60-year-old man decided he wanted to eliminate salt from his diet. To do so, he asked ChatGPT for an alternative to salt, or sodium …

Read More »

Man sought diet advice from ChatGPT and ended up with ‘bromide intoxication,’ which caused hallucinations and paranoia

A man consulted ChatGPT prior to changing his diet. Three months later, after consistently sticking with that dietary change, he ended up in the emergency department with concerning new psychiatric symptoms, including paranoia and hallucinations. It turned out that the 60-year-old had bromism, a syndrome brought about by chronic overexposure to the chemical compound bromide or its close cousin bromine. …

Read More »