TikTok algorithm “bombards” users with self-harm content, according to new study


TikTok’s recommendation algorithm “bombards” teenagers with self-harm and eating disorder content, according to a new study.

The Center for Countering Digital Hate (CCDH) found that the video-sharing platform promoted content including dangerously restrictive diets, pro-self-harm material and videos romanticising suicide to users who showed a preference for such content, even when they were registered as under-18s, The Guardian reports.

As part of its study, the campaign group set up accounts in the US, UK, Canada and Australia, each registered as 13 years old, the minimum age for joining the service.

The body created “standard” and “vulnerable” accounts, the latter containing the term “loseweight” in their usernames, which the CCDH said reflected research showing that social media users who seek out eating disorder content often choose usernames containing related language.


The accounts allegedly “paused briefly” on videos about body image, eating disorders and mental health, and also liked them. This took place over an initial 30-minute period after the accounts launched, in an attempt to capture the effectiveness of the algorithm TikTok uses to recommend content to users.

On the “standard” accounts, content about suicide reportedly appeared in under three minutes, and eating disorder material was shown within eight minutes.

Imran Ahmed, CCDH’s chief executive, told The Guardian: “The results are every parent’s nightmare. Young people’s feeds are bombarded with harmful, harrowing content that can have a significant cumulative impact on their understanding of the world around them, and their physical and mental health.”

The body went on to say the majority of mental health videos presented to its standard accounts via the For You feed – the main way TikTok users experience the app – consisted of users sharing their anxieties and insecurities.

The study also said body image content was more harmful still, with accounts registered to 13-year-olds being shown videos advertising weight-loss drinks and “tummy-tuck” surgery.

One animation shown to the standard accounts featured audio allegedly stating “I’ve been starving myself for you” and had clocked up more than 100,000 likes.

The research also found the accounts were shown self-harm or eating disorder videos every 206 seconds, and that videos relating to body image, mental health and eating disorders were shown to “vulnerable” accounts three times as often as to standard accounts.

The CCDH said its research did not differentiate between content with a positive intent, such as content discussing recovery, and negative content.

A spokesperson for TikTok told NME in response to the study: “This activity and resulting experience does not reflect genuine behaviour or viewing experiences of real people. We regularly consult with health experts, remove violations of our policies, and provide access to supportive resources for anyone in need.

“We’re mindful that triggering content is unique to each individual and remain focused on fostering a safe and comfortable space for everyone, including people who choose to share their recovery journeys or educate others on these important topics.”

TikTok’s guidelines also ban content that promotes behaviour that could lead to suicide and self-harm, as well as material that promotes unhealthy eating behaviours or habits.

For help and advice on mental health:
