YouTube restricts ads on certain videos featuring minors


YouTube is adjusting its recommendation algorithm so that users are less likely to surface videos whose comment sections are being abused to exchange links and tips related to child sexual exploitation. The site is also restricting ads on certain videos featuring minors.

Memo from YouTube

In addition, software will be introduced to detect comments on videos that indicate users are exchanging child-exploitation material, parent company Google reports in a memo that Adweek has published online. YouTube sent the memo to major advertisers to reassure them after the outcry that erupted this week.

The outcry followed a video posted on Sunday by MattsWhatItIs, which described the problem as a “circle of softcore pedophilia” in YouTube comments. In response, major advertisers, including Disney and game maker Epic Games, pulled their ads from the video site. According to an anonymous source, those advertisers account for only a small share of YouTube’s revenue, but the Google subsidiary has nevertheless announced measures. It also deleted 4.3 million videos and 3.7 million comments over the past few days.

It is not the first time YouTube has taken action after an outcry over videos featuring minors. In 2017, for example, the site removed ads from millions of videos and adjusted its algorithm so that children searching for cartoon characters such as Peppa Pig and Thomas the Tank Engine were less likely to end up at videos unsuitable for them.
