YouTube will limit teens’ exposure to videos that promote and idealize a certain level of fitness or physical appearance, the company announced Thursday. The safeguard was first implemented in the US last year and is now being rolled out to teenagers globally.
The announcement comes as YouTube has faced criticism in recent years for potentially harming teenagers and exposing them to content that could encourage eating disorders, reports techcrunch.com.
The type of content YouTube will limit exposure to includes videos that compare physical characteristics and idealize certain fitness levels, body types and weights. Separately, YouTube will also limit exposure to videos that feature “social aggression” in the form of non-contact fighting and intimidation.
The Google-owned platform notes that a single video of this type may be harmless on its own, but that the content could become problematic if it is shown to teenagers repeatedly. To combat this, YouTube will limit repeat recommendations of videos related to these topics.
Because YouTube’s recommendations are driven by what users tend to watch and engage with, the company needs to put these safeguards in place to protect teens from being repeatedly exposed to such content, even when it complies with YouTube’s guidelines.
“As a teen develops thoughts about who they are and their own standards for themselves, repeated consumption of content with idealized standards that begins to form an unrealistic internal standard could lead to the formation of negative self-beliefs,” said YouTube’s Global Head of Health, Dr. Garth Graham.
Thursday’s announcement comes a day after YouTube introduced a new tool that allows parents to link their accounts to their teen’s account to gain access to information about the teen’s activity on the platform. After a parent links their account to their teen’s, they will be alerted to their teen’s channel activity, such as the number of uploads and subscriptions they have.
The tool builds on YouTube’s existing parental controls, which allow parents to set up supervised accounts for children under the age of consent for online services, which is 13 in the US. It is worth noting that other social apps, such as TikTok, Snapchat, Instagram and Facebook, also offer supervised accounts for young users that are linked to their parents’ accounts.