Man Hospitalised After Following Dangerous Diet Advice From OpenAI’s ChatGPT; Doctors Warn


New Delhi: In a rare and alarming case, a man in the United States developed life-threatening bromide poisoning after following diet advice given by ChatGPT. Doctors believe this could be the first known case of AI-linked bromide poisoning, according to a report by Gizmodo.

The case was detailed by doctors at the University of Washington in ‘Annals of Internal Medicine: Clinical Cases’. They said the man consumed sodium bromide for three months, thinking it was a safe substitute for chloride in his diet. This advice reportedly came from ChatGPT, which did not warn him about the dangers.

Bromide compounds were once used in medicines for anxiety and insomnia, but they were banned decades ago due to severe health risks. Today, bromide is mostly found in veterinary drugs and some industrial products. Human cases of bromide poisoning, also called bromism, are extremely rare.

The man first went to the emergency room believing his neighbour was poisoning him. Although his vital signs were largely normal, he was paranoid, refused water despite being thirsty, and experienced hallucinations.

His condition quickly worsened into a psychotic episode, and doctors had to place him under an involuntary psychiatric hold. After receiving intravenous fluids and antipsychotic medicines, he began to improve. Once stable, he told doctors that he had asked ChatGPT for alternatives to table salt.

The AI allegedly suggested bromide as a safe option — advice he followed without knowing it was harmful. Doctors did not have the man’s original chat records, but when they later asked ChatGPT the same question, it again mentioned bromide without warning that it was unsafe for humans.

Doctors Warn About AI’s Dangerous Health Advice

Experts say this shows how AI can provide information without proper context or awareness of health risks. The man recovered fully after three weeks in hospital and was in good health during a follow-up visit. Doctors have warned that while AI can make scientific information more accessible, it should never replace professional medical advice — and, as this case shows, it can sometimes give dangerously wrong guidance.
