Man accidentally poisons himself and suffers hallucinations after following diet recommended by ChatGPT

A man in Washington accidentally poisoned himself after following a diet made by ChatGPT.

The unnamed man, 60, rushed to his local emergency room, suspecting that his neighbor was poisoning him.

About a day after being admitted, he also developed paranoia and hallucinations and attempted to escape from the hospital.

The man later revealed he had several dietary restrictions, including distilling his own water and following an ‘extremely restrictive’ vegetarian diet. 

He told doctors that, after reading about the harms of sodium chloride, or table salt, he asked ChatGPT about eliminating it from his diet.

The chatbot reportedly advised him it was safe to replace salt with sodium bromide, which was used as a sedative in the early 20th century and is now found in anticonvulsants for dogs and humans. 

He ended up following this recommendation for three months and eventually developed bromism, or bromide poisoning. 

Bromide can accumulate in the body and impair nerve function, a condition called bromism. It can cause confusion, memory loss, anxiety, delusions, rashes and acne, symptoms the man also had.

Doctors at the University of Washington in Seattle who treated the man replicated his search and got the same incorrect advice.

They warned that the case highlighted ‘how the use of artificial intelligence can potentially contribute to the development of preventable adverse health outcomes.’

They said ChatGPT and other chatbots could ‘generate scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation.’ 

The anonymous case study, published earlier this month in the Annals of Internal Medicine, comes one week after OpenAI claimed one of the chatbot’s newest upgrades could be better at answering health-related questions and ‘flagging potential concerns.’

However, ChatGPT’s guidelines state it is not ‘intended for use in the diagnosis or treatment of any health condition.’

The patient appeared to have used an earlier version of the software.

After attempting to escape from the hospital, the man was put on an involuntary psychiatric hold and given large amounts of fluids and electrolytes to help flush the bromide out of his system. 

His bromide level was at 1,700 mg/L, while the normal range is between 0.9 and 7.3 mg/L. 

Bromide was used as a sedative in the 19th and 20th centuries and was once widespread in prescription and over-the-counter drugs. However, as research uncovered the risks of chronic exposure, regulators gradually removed bromide from the US drug supply.

As a result, cases today remain few and far between. 

The man reported acne and small red growths on his skin, insomnia, fatigue, muscle coordination issues and excessive thirst. 

It took three weeks for his bromide levels to stabilize and for him to be weaned off psychiatric medications before he could be discharged.

The doctors treating him wrote: ‘While it is a tool with much potential to provide a bridge between scientists and the nonacademic population, AI also carries the risk for promulgating decontextualized information.

‘It is highly unlikely that a medical expert would have mentioned sodium bromide when faced with a patient looking for a viable substitute for sodium chloride.’

They also emphasized that ‘as the use of AI tools increases, providers will need to consider this when screening for where their patients are consuming health information.’ 

Source of information and images: Daily Mail
