Westside People
    Economy

    The group replaces the hotline with a chatbot, and the chatbot is withdrawn due to bad advice

By Izer | June 4, 2023 | 2 Mins Read
It’s a move that might please anyone concerned about the potential job-killing effects of AI tools. As the BBC reports, the US National Eating Disorders Association (NEDA) had to pull its “Tessa” chatbot after it began recommending potentially harmful dieting strategies to people with eating disorders. That was just a week after NEDA replaced its human-run live helpline with the bot. The group announced Tessa’s problems in an Instagram post, per Fortune. “It has come to our attention… that the current version of Tessa Chatbot… may have provided harmful information,” the post read. “We are investigating this immediately and have suspended that program until further notice for a full investigation.”

As NPR reported Wednesday, NEDA turned to AI after running a live helpline for people with anorexia, bulimia, and other eating disorders for more than two decades. The nonprofit reportedly notified helpline employees of the change less than a week after they formed a union. NEDA said the shift had nothing to do with the unionization and everything to do with a surge in calls and texts to the hotline during the COVID-19 pandemic. That rise in call volume, according to NEDA’s leadership, meant increased responsibility, and thus a pivot to expanded use of AI-assisted technology.

As for Tessa’s bad behavior, CNN reports that NEDA CEO Liz Thompson blamed “bad actors” who deliberately tried to goad the chatbot into giving users harmful or simply irrelevant advice. Before the bot’s problems were announced, former helpline employees had tweeted a statement saying that a chatbot cannot “substitute human empathy, and we believe this decision will cause irreparable harm to the eating disorder community.” (Read more AI stories.)

