US eating disorder organization removes its chatbot over harmful diet advice

A stock photo of a woman using a laptop with a chatbot window open

A US organization that supports people with eating disorders has suspended its chatbot after learning it was dispensing harmful advice.

The National Eating Disorders Association (Neda) had recently discontinued its live helpline and directed people seeking assistance to other resources, including the chatbot.

According to the association, the "Tessa" AI bot has been removed.

The association said reports about the bot's behavior would be investigated.

On social media in recent weeks, some users posted screenshots of their interactions with the chatbot.

They claimed the bot continued to encourage behaviors such as dieting and calorie restriction even after being told that the user had an eating disorder.

According to the American Academy of Family Physicians, encouraging patients who are already struggling with weight-related stigma to lose weight further can result in disordered eating habits like bingeing, restricting, or purging.

In a widely shared Instagram post about a conversation with the bot, weight inclusivity advocate Sharon Maxwell claimed that "every single thing Tessa suggested were things that led to the development of my eating disorder". She said the bot advised her to maintain a calorie deficit and weigh herself daily.

"I would not have received help if I had used this chatbot when I was at the height of my eating disorder. "  .

The advice the chatbot offered "is against our policies and core beliefs as an eating disorder organization," Neda CEO Liz Thompson said in a statement provided to US media outlets.

The association had planned to shut down its human-staffed helpline on June 1, 2023, and lay off the employees and volunteers who had run it since it was established in 1999. Officials cited by NPR said growing legal liabilities were among the reasons for the switch.

In their lifetime, almost 10% of Americans will be diagnosed with an eating disorder. The disorders frequently thrive in secrecy, and in many areas of the country, treatment is either expensive or unavailable.

With this in mind, Ellen Fitzsimmons-Craft, a professor of psychiatry at Washington University's medical school, and her group set out to develop a cognitive-behavioural tool that could provide prevention strategies for those with eating disorders.

She told BBC News that the chatbot she created was based on tried-and-true strategies that have been shown to be successful in lowering the prevalence of eating disorders and associated behaviors.

The chatbot was never meant to be a substitute for the helpline, she emphasized. "It was a totally different kind of service."

Ms. Fitzsimmons-Craft and her team handed the program to Neda and a technology company last year for deployment to clients. Since then, she believes, a "bug" or flaw has been introduced into her original design, making the algorithm behave more like modern AI tools such as ChatGPT. Neda said the bot is not powered by ChatGPT and does not perform the same functions.

"That feature was never present in our study," she insisted. "It is not the program that we created, evaluated, and demonstrated to be successful."

The BBC has contacted Neda and the health technology business Cass for comment.

Former helpline employee Abbie Harper told BBC News that workers were informed they would lose their jobs days after the staff officially formed a union.

The CEO and the chairman of the board interrupted the regular Friday staff meeting to inform attendees that their jobs would be eliminated and they would be replaced by chatbots.

"Our mouths fell open. Though it has these pre-programmed responses, we were aware that Tessa existed, primarily for people who struggled with their body image. It's not someone who is actively listening to you with empathy. "  .

Ms. Harper, who is also in recovery, said that talking to someone who had experienced an eating disorder herself was crucial to her healing and to overcoming the stigma and shame she felt.

A bot cannot provide the same support, she said.
