Health and Wellness

A man asked ChatGPT how to remove sodium chloride from his diet. It landed him in the hospital

A 60-year-old man landed in the hospital after asking ChatGPT how to remove sodium chloride from his diet.

As humans interact more with artificial intelligence, there continue to be stories of how a conversation with a chatbot can be dangerous, sometimes even deadly.

While part of the focus has been on mental health and concerns that chatbots are not equipped to handle these types of struggles, there are also implications for people’s physical health.

People are often told not to Google their symptoms, as medical advice should come from a health professional who knows your medical history and can actually examine you.

According to a new case report published Tuesday in a journal of the American College of Physicians, you should also be careful about asking a chatbot health questions.

The report looked at a man who developed bromism after asking ChatGPT for advice on his diet.

Bromism, or bromide toxicity, was well known in the early 20th century but is less common now. At the time, bromide salts were found in many over-the-counter medications used to treat insomnia, hysteria and anxiety. Ingesting too much bromide can cause neuropsychiatric and dermatologic symptoms.

The man in this case report had no past psychiatric or medical history, but during the first 24 hours of his hospitalization, he expressed increased paranoia and auditory and visual hallucinations.

“He was noted to be very thirsty but paranoid about water he was offered,” the case report read.

The man was treated with fluids and electrolytes and became medically stable, allowing him to be admitted to the hospital’s inpatient psychiatry unit.

As his condition improved, he was able to share some symptoms he had noticed, including new facial acne and cherry angiomas, which further suggested he was experiencing bromism.

He also said he had been swapping sodium chloride, or table salt, for sodium bromide for three months after reading about the negative health effects of table salt.

“Inspired by his history of studying nutrition in college, he decided to conduct a personal experiment to eliminate chloride from his diet,” the case report read.


He had replaced table salt with “sodium bromide obtained from the internet after consultation with ChatGPT, in which he had read that chloride can be swapped with bromide, though likely for other purposes, such as cleaning.”

The Independent has reached out to OpenAI, the developer of ChatGPT, for comment.

The man spent three weeks in the hospital before he was well enough to be discharged.

“It is important to consider that ChatGPT and other AI systems can generate scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation,” the authors of the report warned.
