Instagram amplifies pro-eating disorder content ‘bubble’: report
Instagram amplifies pro-eating disorder content to users in a way that fosters a harmful, interconnected community that includes teen and underage users, according to a report released Thursday by advocacy group Fairplay.
Instagram’s algorithm and data profiling tactics create a pro-eating disorder “bubble” on the platform that includes more than 88,000 unique accounts and reaches 20 million unique followers, according to the report.
“[Instagram] collects all of the data that it needs to sort of profile you as someone who’s interested in sort of pro-eating disorder content, and then creates this sort of world around you where it recommends that you follow these people, recommends that these people follow you, it recommends this content, it fills your feed with this sort of content — and that actually you can almost become trapped inside sort of pro-eating disorder bubble, even if you didn’t intend on doing that,” said Rys Farthing, author of the report.
To analyze that purported bubble, researchers selected 153 “seed” accounts that were public and had over 1,000 followers each.
The chosen seed accounts posted visual content that celebrated “thinspiration” or “bonespiration” through images and other eating disorder memes, had an underweight body mass index indicated in their account bio or contained “eating disorder community-relevant vocabulary” such as “ed,” “ana” (referring to anorexia) or “mia” (referring to bulimia), per the report.
An analysis of the seed accounts’ followers found roughly 1.6 million unique accounts. Researchers identified 88,655 among them that followed three or more of the popular seed accounts; that group comprised the purported bubble on which they focused their research.
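The methodology described amounts to a simple overlap count across the seed accounts’ follower lists. Below is a minimal sketch of that counting step, assuming the follower lists have already been collected as sets of account IDs; the account names and data are illustrative, not drawn from the report.

```python
from collections import Counter

# Illustrative sketch only: assumes each seed account's public follower list
# has already been collected as a set of account IDs.
seed_followers = {
    "seed_account_1": {"user_a", "user_b", "user_c"},
    "seed_account_2": {"user_b", "user_c", "user_d"},
    "seed_account_3": {"user_c", "user_e"},
    # ...the report used 153 seed accounts
}

# Count how many of the seed accounts each follower appears under.
follow_counts = Counter()
for followers in seed_followers.values():
    follow_counts.update(followers)

# The report's "bubble": accounts following three or more seed accounts.
bubble = {account for account, n in follow_counts.items() if n >= 3}

print(f"Unique followers of seed accounts: {len(follow_counts):,}")
print(f"Accounts in the bubble (3+ seed follows): {len(bubble):,}")
```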
Account bios indicated a young user base within the bubble, with users displaying their ages alongside lists of their “goal weights,” indicated by “gw.” Users within the bubble had a median age of 18, according to the report.
Researchers identified 21 users in the bubble who stated they are under 13 years of age, including users as young as 9 years old, according to the report. Instagram’s rules require users to be at least 13 to be on the platform.
“There’s a lot of particularly young people in that bubble who I don’t think want to be there, but the algorithm has sort of created this world around them and there they find themselves,” Farthing said.
But Farthing said the prevalence of the content and the recommendations to users are not accidental on Instagram’s part — they are built into the business model.
The report estimates that Meta, the parent company of Instagram and Facebook, derives $2 million in revenue a year from the bubble, and $227.9 million from all those who follow it.
To estimate that revenue, researchers used the Average Revenue Per Person metric that Meta reports quarterly. Because the company does not release that metric for Instagram specifically, the report uses the figure reported for Facebook.
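The arithmetic behind the estimate is essentially a per-user revenue figure multiplied by the size of each group. The sketch below illustrates that calculation with a placeholder ARPU value; it is not the figure Fairplay used and will not reproduce the report’s totals, which depend on the report’s exact inputs and assumptions.

```python
# Rough sketch of the estimate's arithmetic, not the report's actual model.
# The ARPU value is a placeholder; Meta's reported Facebook ARPU varies by
# quarter and region, and the report's exact inputs are not reproduced here.
quarterly_arpu_usd = 11.57              # placeholder: revenue per user per quarter
annual_arpu_usd = quarterly_arpu_usd * 4

bubble_accounts = 88_655                # accounts following three or more seed accounts
bubble_followers = 20_000_000           # unique followers reached by the bubble

print(f"Bubble revenue estimate:   ${bubble_accounts * annual_arpu_usd:,.0f} per year")
print(f"Follower revenue estimate: ${bubble_followers * annual_arpu_usd:,.0f} per year")
```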
The report comes as Meta faces mounting scrutiny over the impact of its services on body image for young users after internal Facebook research documents were released by whistleblower Frances Haugen. Facebook has said the research was mischaracterized in reports about the documents.
A spokesperson for Meta said, “We’re not able to fully address this report because the authors declined to share it with us, but reports like this often misunderstand that completely removing content related to peoples’ journeys with or recovery from eating disorders can exacerbate difficult moments and cut people off from community.”
“Experts and safety organizations have told us it’s important to strike a balance and allow people to share their personal stories while removing any content that encourages or promotes eating disorders,” the spokesperson said.
The company also highlighted its policies and resources in place to help users struggling with body image issues or eating disorders, including a pop-up with tips and a way to connect with organizations like the National Eating Disorders Association (NEDA) and a dedicated reporting option for eating disorder content.
Examples of the content posted by users within the bubble included screenshots of calorie counting apps showing calorie intake ranging from 55 to 1,378 calories per day, as well as a post asking users if the “third day of eating only 300 cals hits different,” according to the report.
Bios of accounts within the bubble included phrases such as “tw” for trigger warning, “sw” for start weight, “cw” for current weight, and “Ana Coach,” for a person who “coaches you to lose more weight,” according to the report.
Content promoting extreme weight loss or glamorizing eating disorders is not unique to the platform or to the social media age. But Instagram’s algorithm and amplification of the content make it harder for users to escape once they’ve been brought into the bubble, Farthing said.
“I think back to when I was 13… and I was looking at those ads of Kate Moss. I could just turn my head away, there wasn’t this sort of $500 million algorithm chasing me around making sure that content returned, making sure that I was encouraged to connect and follow other people who are also interested in this content,” she said.
Eating disorder survivor-turned-activist Kelsey, a 17-year-old high school student from Southern California, shared her story as part of the report. Kelsey said that at the height of her eating disorder, she used social media as “fuel” for her “obsession with weight loss.”
“I took the content they recommended to me of perfect toned bodies and tips for weight loss religiously, it motivated me when I was at my worst to continue down that destructive path of destroying my health. It was only when I learned to distance myself from social media could I then use my outside perspective to see just how horrible the impact was,” she said.
“But it was up to me to actively try and change my social media feeds, I had to do the hard work. This content was just always in my feed already, and somehow it was my responsibility to get it out.”
The report recommends lawmakers further regulate platforms as a way to mitigate concerns about algorithms sucking young, vulnerable users into “bubbles” such as the pro-eating disorder community described in the report. As an example of such regulation, Fairplay cites the U.K. Age Appropriate Design Code, which calls for social media and other internet companies to change how they design their services and process data for minors.
A California Assembly committee is set to hold a hearing next week on the California Age Appropriate Design Code bill, which seeks to adopt a code similar to the U.K.’s.
Copyright 2023 Nexstar Media Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.