Beware Bad Health Advice From AI Chatbots
More and more, people are experimenting with ChatGPT as an alternative to professional therapy. And why not? With little more than a prompt or two, users can receive advice to help them cope with or navigate their struggles. No appointment. No copay. Just professional-sounding guidance in mere seconds.
But is it a good idea to use ChatGPT and other AI chatbots for health advice? As more AI tools become available, we must learn to recognize when AI is the right solution for a given scenario, and when it is not.
Not all health advice is good for you
Earlier this year, the National Eating Disorders Association (NEDA) made headlines when its chatbot, nicknamed Tessa, gave users bad advice about eating disorder recovery. Though benign for most users, the chatbot’s advice, which focused on minimizing sugar and calories, was harmful when presented to someone dealing with an eating disorder. Once alerted to the erroneous advice, NEDA took the chatbot offline and issued an apology on its Instagram account.
As psychologist and certified eating disorder specialist Alexis Conason pointed out, “When someone [with an eating disorder] goes on to a website like NEDA, which is supposed to provide support for eating disorders, and they are met with advice that’s kind of saying, ‘It’s OK to restrict certain foods, you should minimize your sugar intake, you should minimize the amount of calories that you’re consuming each day, you should exercise more,’ it really is giving a green light to engage in the eating disorder behaviors.”
Can robot friends replace your human therapist?
Tessa was by no means the first attempt to introduce AI into health discussions. In June, Martine Paris wrote about asking the Pi AI chatbot to create a dietary plan: “With typos, glitches, and a fair share of ego, Pi ultimately provided some useful tips.” Paris received slightly better results from Google Bard and ChatGPT but still advised readers to treat health advice from chatbots with caution.
Reddit users have no such hesitation. One Redditor praised ChatGPT for its superior “listening skills” compared to a human therapist, saying, “In a very scary way, I feel HEARD by ChatGPT.” Other Redditors quickly echoed the sentiment in the comments, pointing out how inaccessible and unaffordable healthcare is for them.
The problem with AI therapy bots is that they can work from faulty conclusions or miss alarming health indicators within a conversation. In a 2019 article in the Journal of the American Medical Informatics Association, researchers analyzed 74 studies involving chatbots and identified 80 safety concerns, including “incorrect or incomplete information, variation in content, and incorrect or inappropriate response to consumer needs.” Subsequent research published in 2022 by a different team concluded that ethics and human oversight must evolve and adapt to ensure the benefits of AI chatbots in healthcare are not outweighed by their drawbacks.
Therapy is best when given by humans, not AI chatbots
At Oasis Eating Recovery, our team of dietitians, psychiatrists, and nurse practitioners collaborates on each patient’s journey toward a healthier and happier life. We provide the following programs and services:
- Partial Hospitalization Program (PHP)
- Intensive Outpatient Program (IOP)
- Outpatient Psychiatry
- Mental Health Services for Teens and Young Adults
If you or a loved one needs professional help managing and overcoming an eating disorder, contact us today.