The video-streaming platform says that repeated exposure to certain types of videos could hurt teenagers’ self-esteem and body image.
European teenagers may notice a change in their YouTube recommendations, after the video platform said it will stop ushering them toward some types of health and fitness videos, particularly those that “idealise” certain body types.
YouTube – which is among the most popular social media apps for teens – recommends videos that are similar to those the viewer has watched previously.
That means people can fall into feedback loops, watching many similar videos in a row and at times delving into more extreme content.
The new rule is an effort to prevent teens from forming “negative beliefs about themselves,” Dr Garth Graham, who heads YouTube Health, and James Beser, director of product management for YouTube Youth, said in a statement.
So what does this mean and what kind of content will now be restricted?
Which videos will YouTube restrict for teenagers?
YouTube said it will now limit repeated recommendations of videos that:
- Idealise particular fitness levels or weight groups
- Compare and idealise certain physical features, or
- Are socially aggressive, meaning they show intimidation or fighting.
These types of content “may be innocuous as a single video, but could be problematic for some teens if viewed repetitively,” Graham and Beser said.
How does social media affect body image?
Social media can lead to poor body image, eating disorders, and mental health issues, according to a major review of 50 studies from 17 countries published last year.
That’s because people tend to compare themselves to others they see online, internalise a thin or fit standard as the ideal body type, and engage in self-objectification.
That doesn’t mean everyone is affected equally.
Women and girls, overweight people, and those who already have poor body image tend to be most affected by social media, while people who feel OK about their bodies and have high social media literacy are less affected. Researchers call this dynamic a “self-perpetuating cycle of risk”.
Meanwhile, a 2021 study found that fitness YouTubers – a group dubbed the “Fitspiration” community – promote unhealthy behaviours, and that viewers reinforce those practices in the comments.
What other steps has YouTube taken?
YouTube already restricts teenagers’ access to some content involving eating disorders and physical fights.
Under the new policy, YouTube can also redirect people to crisis hotlines when they search for topics related to suicide, self-harm, and eating disorders.
The company also said it worked with organisations in Germany and France to craft the update.
What are regulators doing about the problem?
YouTube and other social media sites have come under fire for their effect on young people’s mental health and well-being, and some governments have threatened to crack down.
In the United Kingdom, for example, communications regulator Ofcom ordered tech companies in May to take steps to stop their algorithms from “recommending harmful content to children,” including content on self-harm and eating disorders.
The European Union’s Digital Services Act, which was adopted in 2022, also calls on tech giants to limit children’s access to content that could harm their “health, physical, mental and moral development”.