It’s a move that might please anyone concerned about the potential job-killing effects of AI tools. According to reports from the BBC, the American National Eating Disorders Association (NEDA) had to take down its “Tessa” chatbot after it began recommending potentially harmful diet strategies to people with eating disorders. This happened just a week after NEDA opted to replace its human-run live helpline with the chatbot. The group announced the problem with Tessa in an Instagram post. “It has come to our attention… that the current version of the Tessa Chatbot… may have provided harmful information,” the post read. “We are investigating this immediately and have suspended that program until further notice for a full investigation.”
As NPR reported Wednesday, NEDA turned to AI after running a live helpline for people with anorexia, bulimia, and other eating disorders for more than two decades. The nonprofit reportedly notified helpline employees of the change less than a week after they formed a union. NEDA said the shift had nothing to do with the employees’ unionization and everything to do with a surge in calls and texts to the hotline during the COVID-19 pandemic. That rise in volume, according to NEDA’s leadership, meant increased responsibility and prompted an expanded use of AI-assisted technology.
As for Tessa’s bad behavior, CNN reports that NEDA CEO Liz Thompson blamed “bad actors” who intentionally tried to get the chatbot to give users harmful or even irrelevant advice. Before the bot’s problems were announced, former helpline employees had tweeted a statement saying that chatbots cannot “substitute human empathy, and we believe this decision will cause irreparable harm to the eating disorder community.” (Read more AI stories.)