An AI replaced helpline employees and went berserk

By Alex · June 1, 2023 · 3 Mins Read

The National Eating Disorders Association (NEDA) of the United States decided to replace its helpline staff with a ChatGPT-like chatbot. The measure, adopted by the organization after its employees decided to unionize, turned out worse than expected. Instead of helping, the AI offered advice that put people's lives at risk.

The chatbot, known as Tessa, was designed to address mental health problems and help prevent eating disorders. Last Monday, activist Sharon Maxwell posted on Instagram that the chatbot had offered her advice on how to lose weight. The AI recommended a deficit of 500 to 1,000 calories per day and told her to weigh and measure herself every week.

"If I had accessed this chatbot when I was in the middle of my eating disorder, I would have received no help... If I hadn't received help, I wouldn't be alive today," Maxwell wrote on her Instagram profile.

NEDA initially dismissed the report as false. However, the organization backtracked after screenshots of interactions with the chatbot-run helpline went viral. "It may have provided information that was harmful... We are investigating and have removed that program until further notice," NEDA said in a statement.


An AI-powered helpline

It all happened less than a week after NEDA announced that on June 1 its human-staffed helpline, created 20 years ago, would shut down and Tessa would take its place. "A chatbot is not a substitute for human empathy. We believe this decision will cause irreparable harm to the eating disorder community," one of the affected employees had warned.

Alexis Conason, a psychologist who specializes in eating disorders, also tested the chatbot-run helpline and posted screenshots of the conversation on her Instagram. "In general, a safe and sustainable rate of weight loss is 1-2 pounds per week," one of the bot's messages read. "Validating that it's important to lose weight supports disordered eating and encourages disordered and unhealthy behaviors," Conason later told the Daily Dot.

Using a chatbot in healthcare is dangerous, says WHO


“We are concerned and are working with the technology team and the research team to look at this further,” Liz Thompson, NEDA’s executive director, told Vice. “Such language goes against our core beliefs and policies as an organization,” she added.

More than 2,500 people had interacted with the chatbot-run helpline. So far, Thompson said, NEDA had received no other complaints. The chatbot program has been temporarily suspended until "the error can be corrected," the NEDA representative said.

Tessa was created by a team at the University of Washington School of Medicine. The chatbot, less powerful than ChatGPT, was trained to address body image issues using therapeutic methods. According to its creators, it has guardrails that filter unwanted responses, though recent cases have shown them to be insufficient.

The World Health Organization (WHO) has warned that care should be taken when using AI chatbots in healthcare. The agency cautioned that the data used to train these models may be biased and generate misleading information that could harm patients.
