The National Eating Disorder Association (NEDA) recently made headlines after announcing plans to discontinue its 20-year-old helpline for people seeking assistance with eating disorders and body image issues. The organization planned to replace the human staff with Tessa, a chatbot developed by Cass in collaboration with Washington University researchers and launched on the NEDA website in February 2022. The decision drew backlash from the helpline’s human staff, who claimed that NEDA intended to let them go and replace them with the chatbot.

NEDA’s CEO, Liz Thompson, confirmed that the organization planned to close the helpline but denied that Tessa was intended to act as its replacement. Thompson attributed the confusion to conflation in media reporting and clarified that a chatbot, even a highly intuitive one, could not replace human interaction. According to her, the organization had been evaluating the helpline’s closure for three years for business reasons.

The Controversy

NEDA faced further controversy when Sharon Maxwell, a weight-inclusive consultant, claimed on Instagram that Tessa had given her potentially harmful advice, including recommendations for restrictive dieting. This type of dieting eliminates certain foods or food groups or strictly limits the types and portions of food a person can eat, and it has been decried by nutritionists and other health and fitness experts in recent years.

Following Maxwell’s post, NEDA released a statement on its Instagram account acknowledging the accusations and announcing that it had taken the Tessa program down until further notice pending a complete investigation. Thompson further clarified that the language the chatbot used ran counter to NEDA’s policies and core beliefs as an eating disorder organization.

Thompson also said that Cass had reported unusual activity in the Tessa chatbot, suggesting it was being targeted by malicious actors or bots. Even amid these attacks, the ‘off messaging’ occurred in only 0.1% of more than 25,000 messages. Thompson said the organization would continue working to ensure the technology could withstand future attacks.

The Lesson

The controversy around NEDA’s chatbot highlights the importance of human customer support in organizations. Even well-intentioned AI programs designed with expert input for specific use cases can produce undesirable and potentially harmful responses, damaging both a company’s relationship with its users and customers and its public perception. With companies racing to adopt generative AI tools, IT decision-makers would do well to learn from NEDA’s experience.

While it is uncertain whether NEDA could have avoided or minimized the controversy by being more communicative or transparent in its decision-making around sunsetting the helpline, having a pre-existing AI chatbot in the mix only fueled accusations that NEDA was seeking to devalue and replace human labor with artificial intelligence. The controversy has left NEDA on the defensive.

Thompson did not offer a timeline or definitive plan for Tessa’s return, but she said the organization would continue working on the bugs and would not relaunch until everything was ironed out. When it does relaunch Tessa, NEDA plans to highlight what Tessa is, what it isn’t, and how users can get the most out of the experience.
