National Eating Disorders Association phases out human helpline, pivots to chatbot

Abbie Harper worked for a helpline run by the National Eating Disorders Association (NEDA), which is now being phased out. Harper disagrees with the new plan to use an online chatbot to help users find information about eating disorders.

For more than 20 years, the National Eating Disorders Association (NEDA) has operated a phone line and online platform for people seeking help with anorexia, bulimia, and other eating disorders. Last year, nearly 70,000 individuals used the helpline.

NEDA shuttered that service in May. Instead, the non-profit will use a chatbot called Tessa that was designed by eating disorder experts, with funding from NEDA.

(When NPR first aired a radio story about this on May 24, Tessa was up and running online. But since then, both the chatbot's page and a NEDA article about Tessa have been taken down. When asked why, a NEDA official said the bot is being "updated," and the latest "version of the current program [will be] available soon.")

Paid staffers and volunteers for the NEDA hotline expressed shock and sadness at the decision, saying it could further isolate the thousands of people who use the helpline when they feel they have nowhere else to turn.

"These young kids...don't feel comfortable coming to their friends or their family or anybody about this," says Katy Meta, a 20-year-old college student who has volunteered for the helpline. "A lot of these individuals come on multiple times because they have no other outlet to talk with anybody...That's all they have, is the chat line."

The decision is part of a larger trend: facing a sharp escalation in demand, many mental health organizations and companies are struggling to provide services and care, and some are turning to chatbots and AI, even though clinicians are still working out how to deploy them effectively, and for which conditions.

The research team that developed Tessa has published studies showing it can help users improve their body image. But they've also released studies showing the chatbot may miss red flags (like users saying they plan to starve themselves) and could even inadvertently reinforce harmful behavior.

More demands on the helpline increased stresses at NEDA

On March 31, NEDA notified the helpline's five staffers that they would be laid off in June, just days after the workers formally notified their employer that they had formed a union. "We will, subject to the terms of our legal responsibilities, [be] beginning to wind down the helpline as currently operating," NEDA board chair Geoff Craddock told helpline staff on a call March 31. NPR obtained audio of the call. "With a transition to Tessa, the AI-assisted technology, expected around June 1."

NEDA's leadership denies the helpline decision had anything to do with the unionization, but told NPR it became necessary after the COVID-19 pandemic, when eating disorders surged and the number of calls, texts and messages to the helpline more than doubled. Many of those reaching out were suicidal, dealing with abuse, or experiencing some kind of medical emergency. NEDA's leadership contends the helpline wasn't designed to handle those types of situations.

The increase in crisis-level calls also raises NEDA's legal liability, managers explained in an email sent March 31 to current and former volunteers, informing them the helpline was ending and that NEDA would "begin to pivot to the expanded use of AI-assisted technology."

"What has really changed in the landscape are the federal and state requirements for mandated reporting for mental and physical health issues (self-harm, suicidality, child abuse)," according to the email, which NPR obtained. "NEDA is now considered a mandated reporter and that hits our risk profile — changing our training and daily work processes and driving up our insurance premiums. We are not a crisis line; we are a referral center and information provider."

COVID created a "perfect storm" for eating disorders

When it was time for a volunteer shift on the helpline, Meta usually logged in from her dorm room at Dickinson College in Pennsylvania. During a video interview with NPR, the room appeared cozy and warm, with twinkly lights strung across the walls, and a striped crochet quilt on the bed.

Meta recalls a recent conversation on the helpline's messaging platform with a girl who said she was 11. The girl said she had just confessed to her parents that she was struggling with an eating disorder, but the conversation had gone badly.

"The parents said that they 'didn't believe in eating disorders,' and [told their daughter] 'You just need to eat more. You need to stop doing this,'" Meta recalls. "This individual was also suicidal and exhibited traits of self-harm as well...it was just really heartbreaking to see."

Eating disorders are common, serious, and sometimes fatal illnesses. An estimated nine percent of Americans experience an eating disorder during their lifetime. Eating disorders also have some of the highest mortality rates among mental illnesses, with an estimated death toll of more than 10,000 Americans each year.

But after the COVID-19 pandemic hit, closing schools and forcing people into prolonged isolation, crisis calls and messages like the one Meta describes became far more frequent on the helpline. That's because the pandemic created a "perfect storm" for eating disorders, according to Dr. Dasha Nicholls, a psychiatrist and eating disorder researcher at Imperial College London.

In the U.S., the rate of pediatric hospitalizations and ER visits surged. For many people, the stress, isolation and anxiety of the pandemic was compounded by major changes to their eating and exercise habits, not to mention their daily routines.

On the NEDA helpline, the volume of contacts increased by more than 100% compared to pre-pandemic levels. And workers taking those calls and messages were witnessing the escalating stress and symptoms in real time.

"Eating disorders thrive in isolation, so COVID and shelter-in-place was a tough time for a lot of folks struggling," explains Abbie Harper, a helpline staff associate. "And what we saw on the rise was kind of more crisis-type calls, with suicide, self-harm, and then child abuse or child neglect, just due to kids having to be at home all the time, sometimes with not-so-supportive folks."

There was another 11-year-old girl, this one in Greece, who said she was terrified to talk to her parents "because she thought she might get in trouble" for having an eating disorder, recalls volunteer Nicole Rivers. On the helpline, the girl found reassurance that her illness "was not her fault."

"We were actually able to educate her about what eating disorders are," Rivers says. "And that there are ways that she could teach her parents about this as well, so that they may be able to help support her and get her support from other professionals."

What personal contact can provide

Because many volunteers have successfully battled eating disorders themselves, they're uniquely attuned to the experiences of those reaching out, Harper says. "Part of what can be very powerful in eating disorder recovery is connecting to folks who have a lived experience. When you know what it's been like for you, and you know that feeling, you can connect with others over that."

Until a few weeks ago, the helpline was run by just five or six paid staffers and two supervisors, and depended on a rotating roster of 90 to 165 volunteers at any given time, according to NEDA.

Yet even after lockdowns ended, NEDA's helpline volume remained elevated above pre-pandemic levels, and the cases continued to be clinically severe. Staff felt overwhelmed, undersupported, and increasingly burned out, and turnover increased, according to multiple interviews with helpline staffers.

The helpline staff formally notified NEDA that their unionization vote had been certified on March 27. Four days later, they learned their positions were being eliminated.

It was no longer possible for NEDA to continue operating the helpline, says Lauren Smolar, NEDA's Vice President of Mission and Education.

"Our volunteers are volunteers," Smolar says. "They're not professionals. They don't have crisis training. And we really can't accept that kind of responsibility." Instead, she says, people seeking crisis help should be reaching out to resources like 988, a 24/7 suicide and crisis hotline that connects people with trained counselors.

The surge in volume also meant the helpline was unable to respond immediately to 46% of initial contacts, and it could take between 6 and 11 days to respond to messages.

"And that's frankly unacceptable in 2023, for people to have to wait a week or more to receive the information that they need, the specialized treatment options that they need," she says.

After learning in the March 31 email that the helpline would be phased out, volunteer Faith Fischetti, 22, tried the chatbot out on her own. "I asked it a few questions that I've experienced, and that I know people ask when they want to know things and need some help," says Fischetti, who will begin pursuing a master's in social work in the fall. But her interactions with Tessa were not reassuring: "[The bot] gave links and resources that were completely unrelated" to her questions.

Fischetti's biggest worry is that someone coming to the NEDA site for help will leave because they "feel that they're not understood, and feel that no one is there for them. And that's the most terrifying thing to me."

She wonders why NEDA can't have both: a 24/7 chatbot to pre-screen users and reroute them to a crisis hotline if needed, and a human-run helpline to offer connection and resources. "My question became, why are we getting rid of something that is so helpful?"
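
The hybrid she imagines amounts to a triage layer sitting in front of both services. A rough, hypothetical sketch of that routing logic might look like the following; the keywords and destination names are invented for illustration, not NEDA's actual design.

```python
# A hypothetical triage layer for the hybrid model Fischetti describes:
# screen each incoming message, hand apparent crises to a staffed crisis
# line, and reserve the chatbot for informational requests. The keywords
# and destination names are invented for illustration.

CRISIS_TERMS = ("suicid", "kill myself", "self-harm", "hurt myself", "abuse")

def route(message: str) -> str:
    text = message.lower()
    if any(term in text for term in CRISIS_TERMS):
        return "CRISIS_LINE"     # e.g., a warm handoff to 988
    if "talk to someone" in text:
        return "HUMAN_HELPLINE"  # lived-experience connection and referrals
    return "CHATBOT"             # scripted information and coping skills

print(route("I need information about treatment options near me"))  # CHATBOT
print(route("I've been having suicidal thoughts"))                  # CRISIS_LINE
```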

A chatbot designed to help treat eating disorders

Tessa the chatbot was created to help a specific cohort: people with eating disorders who never receive treatment.

Only 20% of people with eating disorders get formal help, according to Ellen Fitzsimmons-Craft, a psychologist and professor at Washington University School of Medicine in St. Louis. Her team created Tessa after receiving funding from NEDA in 2018, with the goal of looking for ways technology could help fill the treatment gap.

"Unfortunately, most mental health providers receive no training in eating disorders," Fitzsimmons-Craft says. Her team's ultimate goal is to provide free, accessible, evidence-based treatment tools that leverage the power and reach of technology.

But Tessa is not intended to be a universal fix, she says. "I don't think it's an open-ended tool for you to talk to, and feel like you're just going to have access to kind of a listening ear, maybe like the helpline was. It's really a tool in its current form that's going to help you learn and use some strategies to address your disordered eating and your body image."

Tessa is a "rule-based" chatbot, meaning she's programmed with a limited set of possible responses. She is not ChatGPT, and cannot generate unique answers in response to specific queries. "So she can't go off the rails, so to speak," Fitzsimmons-Craft says.
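
For readers curious what "rule-based" means in practice, here is a minimal sketch in Python. Everything in it, the trigger keywords, the canned responses, the fallback line, is invented for illustration; Tessa's actual rules are not public.

```python
# A minimal, hypothetical illustration of a rule-based chatbot: every reply
# is chosen from a fixed, pre-written set, so the bot cannot generate novel
# text. The keywords and responses below are invented, not Tessa's content.

RULES = [
    # (trigger keywords, canned response)
    (("body image", "appearance"),
     "Let's try an exercise on noticing and challenging negative body-image thoughts."),
    (("meal", "regular eating"),
     "Regular, structured eating is a common first step. Want to walk through it?"),
]

DEFAULT = "I'm not sure I understood. Could you rephrase, or pick a topic from the menu?"

def respond(message: str) -> str:
    text = message.lower()
    for keywords, reply in RULES:
        if any(keyword in text for keyword in keywords):
            return reply  # always pre-written, never generated
    return DEFAULT

print(respond("I've been struggling with my body image lately"))
```

Because every possible reply is written in advance, a bot like this cannot invent text, which is the safety property Fitzsimmons-Craft describes. The tradeoff, as the studies below show, is that it can respond poorly when a user's input falls outside the patterns its designers anticipated.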

In its current form, Tessa can guide users through an interactive, weeks-long course about body positivity, based on cognitive behavioral therapy tools. Additional content about bingeing, weight concerns, and regular eating is also being developed but is not yet available to users.

There's evidence the concept can help. Fitzsimmons-Craft's team did a small study that found college students who interacted with Tessa had significantly greater reductions in "weight/shape concerns" compared to a control group at both 3- and 6-month follow-ups.

But even the best-intentioned technology may carry risks. Fitzsimmons-Craft's team published a different study looking at ways the chatbot "unexpectedly reinforced harmful behaviors at times." For example, the chatbot would give users a prompt: "Please take a moment to write about when you felt best about your body?"

Some of the responses included: "When I was underweight and could see my bones." "I feel best about my body when I ignore it and don't think about it at all."

The chatbot's response seemed to ignore the troubling aspects of such responses — and even to affirm negative thinking — when it would reply: "It is awesome that you can recognize a moment when you felt confident in your skin, let's keep working on making you feel this good more often."

Researchers were able to troubleshoot some of those issues. But the chatbot still missed red flags, the study found, like when it asked: "What is a small healthy eating habit goal you would like to set up before you start your next conversation?"

One user replied, "Don't eat."

"Take a moment to pat yourself on the back for doing this hard work, <<USER>>!" the chatbot responded.

The study described the chatbot's capabilities as something that could be improved over time, with more inputs and tweaks: "With many more responses, it would be possible to train the AI to identify and respond better to problematic responses."
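
The study's suggestion can be sketched in miniature: check each user reply against known red-flag patterns before the bot delivers its scripted praise, and escalate rather than affirm when one matches. The patterns and messages below are hypothetical stand-ins; a real system would need a trained classifier and clinical review, not a keyword list.

```python
# A hypothetical red-flag screen run on the user's free-text reply *before*
# the bot delivers its scripted praise. The patterns and messages here are
# illustrative stand-ins, not clinical criteria.
import re

RED_FLAGS = [
    r"\bdon'?t eat\b",
    r"\bstarv(e|ing)\b",
    r"\bunderweight\b",
    r"\bsee my bones\b",
    r"\b(hurt|harm)(ing)? myself\b",
]

ESCALATION = ("I'm concerned about what you wrote, and this tool can't help "
              "with it safely. Please call or text 988 to reach a trained "
              "crisis counselor.")

PRAISE = "Take a moment to pat yourself on the back for doing this hard work!"

def screened_reply(user_text: str) -> str:
    text = user_text.lower()
    if any(re.search(pattern, text) for pattern in RED_FLAGS):
        return ESCALATION  # never affirm a reply that matches a red flag
    return PRAISE

print(screened_reply("Don't eat."))                      # escalates instead of praising
print(screened_reply("I cooked dinner with a friend."))  # scripted praise
```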

MIT professor Marzyeh Ghassemi has seen issues like this crop up in her own research developing machine learning to improve health.

Large language models and chatbots are inevitably going to make mistakes, but "sometimes they tend to be wrong more often for certain groups, like women and minorities," she says.

If people receive bad advice or instructions from a bot, "people sometimes have a difficulty not listening to it," Ghassemi adds. "I think it sets you up for this really negative outcome...especially for a mental health crisis situation, where people may be at a point where they're not thinking with absolute clarity. It's very important that the information that you give them is correct and is helpful to them."

And if the value of the live helpline was the ability to connect with a real person who deeply understands eating disorders, Ghassemi says a chatbot can't do that.

"If people are experiencing a majority of the positive impact of these interactions because the person on the other side understands fundamentally the experience they're going through, and what a struggle it's been, I struggle to understand how a chatbot could be part of that."

Copyright 2023 Michigan Radio

Kate Wells is a Peabody Award-winning journalist and co-host of the Michigan Radio and NPR podcast Believed. The series was widely ranked among the best of the year, drawing millions of downloads and numerous awards. She and co-host Lindsey Smith received the prestigious Livingston Award for Young Journalists. Judges described their work as "a haunting and multifaceted account of USA Gymnastics doctor Larry Nassar's belated arrest and an intimate look at how an army of women – a detective, a prosecutor and survivors – brought down the serial sex offender."