Technophile News

ChatGPT’s Health Advice Sends 60-Year-Old Man to the Hospital, Raises Questions on Its Reliability

By News Room | 11 August 2025 | 3 Mins Read

ChatGPT’s health advice was behind a man’s trip to the hospital, according to a new case study. The study reports that a 60-year-old man suffered a rare case of bromide poisoning, which produced a range of symptoms, including psychosis. The poisoning, caused by long-term sodium bromide consumption, occurred after the patient took dietary advice from ChatGPT. Notably, with GPT-5, OpenAI is now promoting health-related responses from the artificial intelligence (AI) chatbot as a key feature.

ChatGPT Said to Have Asked a Man to Replace Table Salt With Sodium Bromide

According to an Annals of Internal Medicine Clinical Cases report titled “A Case of Bromism Influenced by Use of Artificial Intelligence,” a person developed bromism after consulting the AI chatbot ChatGPT for health information.

The patient, a 60-year-old man with no past psychiatric or medical history, was admitted to the emergency room, concerned that he was being poisoned by his neighbour, the case study stated. He suffered from paranoia, hallucinations, a distrust of water despite being thirsty, insomnia, fatigue, problems with muscle coordination (ataxia), and skin changes, including acne and cherry angiomas.

After immediate sedation and a series of tests, including a consultation with Poison Control, the medical professionals diagnosed the condition as bromism, a syndrome that develops after long-term consumption of sodium bromide (or any other bromide salt).

According to the case study, the patient said he had consulted ChatGPT about replacing sodium chloride (table salt) in his diet; after the chatbot suggested sodium bromide as an alternative, he consumed it regularly for three months.

Based on the case’s undisclosed timeline, the study estimates that either GPT-3.5 or GPT-4 was used for the consultation. However, the researchers note that they did not have access to the conversation log, so the exact prompt and response cannot be assessed. It is likely that the man took ChatGPT’s answer out of context.

“However, when we asked ChatGPT 3.5 what chloride can be replaced with, we also produced a response that included bromide. Though the reply stated that context matters, it did not provide a specific health warning, nor did it inquire about why we wanted to know, as we presume a medical professional would do,” the study added.

Live Science reached out to OpenAI for comment. A company spokesperson directed the publication to the company’s terms of use, which state that users should not rely on output from ChatGPT as a “sole source of truth or factual information, or as a substitute for professional advice.”

After prompt action and three weeks of treatment, the study said, the patient began showing improvement. “It is important to consider that ChatGPT and other AI systems can generate scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation,” the researchers said.


© 2025 Technophile News. All Rights Reserved.