Українська правда

Man admitted to mental hospital after diet consultation with ChatGPT

A case of psychosis caused by bromide poisoning has been recorded in the US: a man took the substance for three months on the recommendation of ChatGPT. The case was reported by doctors from the University of Washington in the journal Annals of Internal Medicine: Clinical Cases, writes Gizmodo.

The patient arrived at the hospital claiming he had been poisoned by a neighbor. Although his vital signs were within normal limits, he exhibited paranoid behavior, refused the water he was offered despite being thirsty, and experienced auditory and visual hallucinations. He later developed a full-blown psychotic episode, during which he tried to escape from the doctors and was involuntarily admitted to a psychiatric hospital.

Doctors suspected bromism, bromide poisoning that has become rare since the 1980s, when the substance was removed from prescription drugs due to its toxicity. They gave the patient intravenous fluids and an antipsychotic to stabilize his mental state. Once he improved, the man explained that he had started taking sodium bromide in an attempt to reduce his intake of sodium chloride (table salt).

Unable to find clear guidance in scientific sources on how to replace table salt, he turned to ChatGPT, which, he said, suggested replacing chloride with bromide. He then purchased the substance online and began taking it. The doctors suggest that ChatGPT may have made the recommendation without proper context and without warning about the risks.

Given the timeline of the case, the man was likely consulting ChatGPT version 3.5 or 4.0. The doctors did not have access to the patient's chat history, but when they queried ChatGPT 3.5 themselves, it did indeed mention bromide as a replacement for chloride. The model most likely had technical applications in mind, such as cleaning, rather than food, yet it did not ask about the purpose of the request and did not warn about the substance's toxicity.

The patient stabilized after treatment, was discharged three weeks later, and remained in satisfactory condition at follow-up.

Doctors stressed that while AI can be a useful tool, its advice should not replace consultation with a specialist, especially in matters related to health and safety.
