A study of TikTok’s video recommendation algorithm found that it recommended content about eating disorders and self-harm to some new teen accounts within minutes.
Research by the Center for Countering Digital Hate (CCDH) found that one account was shown suicide-related content within 2.6 minutes, while another was shown eating disorder content within 8 minutes.
A further investigation by Sky News also found harmful eating disorder content being recommended through TikTok’s suggested search feature, even though researchers did not search for explicitly harmful terms.
British eating disorder charity BEAT said the findings were “extremely shocking” and called on TikTok to take “urgent action to protect vulnerable users”.
Content Warning: This article contains references to eating disorders and self-harm
TikTok’s For You page offers a collection of videos recommended to users based on the type of content they engage with on the app.
The social media company said recommendations are based on a variety of factors, including video likes, follows, shares and device settings such as language preference.
But some have raised concerns about the way the algorithm behaves when it recommends harmful content.
CCDH researchers opened two new accounts in each of the UK, US, Canada and Australia. Each was given a traditionally female username, with the age set to 13. The second account in each country also contained the phrase “lose weight” in its username, a trait that separate research has linked to accounts belonging to vulnerable users.
CCDH researchers analyzed the video content displayed on each new account’s For You page over a 30-minute period, interacting only with videos related to body image and mental health.
It found that the accounts were shown videos related to mental health and body image every 39 seconds on average.
Not all content recommended at this rate is harmful, and the study did not distinguish between positive and negative content.
However, it found that all users received eating disorder content and suicide content, sometimes at a rapid rate.
The CCDH research also found that the vulnerable accounts were shown three times as much of this content as the standard accounts, and that the content they received was more extreme.
TikTok is host to an eating disorder content community that has amassed more than 13.2 billion views across 56 different hashtags, according to CCDH findings.
About 59.9 million of those views were on hashtags that contain a high concentration of videos in support of eating disorders.
However, TikTok said the activity and resulting experiences captured in the study “did not reflect the behavior of real people or the real viewing experience.”
Kelly Macarthur has had an eating disorder since she was 14 years old. She has now recovered, but as a content creator on TikTok, she worries that some of its content may have an impact on those who are suffering.
“When I’m not feeling well, I think social media is a really healthy place where I can vent about my issues. But in reality, it’s full of anorexic material that gives you different cues and triggers,” she told Sky News.
“I’m seeing the same thing happen to young people on TikTok.”
A further investigation by Sky News found that TikTok suggested harmful eating disorder content in other areas of the app, even though researchers did not explicitly search for it.
Sky News conducted its own research into TikTok’s recommendation algorithm using several different accounts. But instead of analyzing the For You page, we searched TikTok’s search bar for innocuous terms like “weight loss” and “diet.”
Searching for the term “diet” on one account turned up another suggestion, “pr0 a4a”.
This is the “pro ana” code associated with pro-anorexic content.
TikTok’s community guidelines prohibit content related to eating disorders on its platform, and this includes prohibiting searches for terms explicitly related to it.
But users often tinker with terminology slightly, meaning they can continue to post about certain issues without being spotted by TikTok moderators.
While TikTok has banned the term “pro ana,” variations of it still pop up.
Sky News also found that eating disorder content was easily accessible through TikTok’s user search function, despite researchers not explicitly searching for it.
A search for the term “weight loss” returned at least one account among the top 10 results that appeared to be an eating disorder account.
Sky News reported the account to TikTok, which has since removed it.
“It’s shocking that TikTok’s algorithm is actively pushing users towards damaging videos that can have devastating effects on vulnerable groups,” said Tom Quinn, director of external affairs at BEAT.
“TikTok and other social media platforms must act urgently to protect vulnerable users from harmful content.”
In response to the findings, a TikTok spokesperson said: “We regularly consult with health professionals, remove content that violates our policies, and provide support resources to anyone who needs them.
“We’re aware that triggering content is unique to each individual, and we continue to focus on creating a safe and comfortable space for everyone, including those who choose to share their recovery journey or educate others on these important topics.”
The Data and Forensics team is a multi-skilled unit dedicated to delivering transparent news coverage from Sky News. We collect, analyze and visualize data to tell data-driven stories. We combine traditional reporting techniques with advanced analysis of satellite imagery, social media and other open source information. Through multimedia storytelling, we aim to better explain the world while showing how our journalism is done.