
YouTube and Child Safety: Content Worm Holes Explained

Writer: Liz at the Flaxen

Big brands are again withdrawing their ad spend from the YouTube platform in response to a reported "softcore paedophilia ring". How exactly has the video platform been creating content "worm holes" centred on young girls?

The issue was highlighted in a recent video by YouTuber Matt Watson on his channel MattsWhatItIs, but has (according to Matt) been known for some time among YouTube users and across online forums. The algorithm has been repeatedly delivering videos of little girls, in quantity, to users with paedophilic tendencies, and those users have been leaving inappropriate comments, even listing the time codes of compromising shots for their pervy mates to find.


Simple Overview of The YouTube Algorithm

In order to explain why this could happen, we need a quick overview of the YouTube algorithm. YouTube don't share details of how their algorithm works, because if they did we'd all be gaming it and gaining huge views (and revenue) from our uploads. What we do know is that the algorithm is based on user behaviour and promotes to each user the videos they are most likely to watch. It is designed to improve the user experience on the platform so that individuals watch more videos for longer, stay on the platform and ultimately generate more ad revenue.


As an example, if I search for FIFA 19 on YouTube I am delivered a list of suggested videos to watch, mainly on that topic. The top videos and channels presented to me are the ones that YouTube knows have been most popular with users who used the same search term as me. These users watched and engaged with these particular FIFA 19 videos, and their behaviour suggests that these videos are perfect search returns for these keywords. If I click on the top search return and watch the video, I get a list of recommended videos in the right pane. Some of these have also been popular with people who searched for FIFA 19 and watched this video. The rest are based on my personal viewing history. I have been watching a lot of makeup tutorials recently, so my suggested videos are now a mix of FIFA, gaming and makeup.
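
We can't see YouTube's real code, but a toy version of that "people who watched this also watched..." logic might look something like the Python sketch below. All the usernames, video names and viewing data are made up by me; the real system uses vastly richer signals than this.

```python
from collections import Counter, defaultdict

# Made-up watch sessions: for each hypothetical user, the videos they
# watched, in order. The real system uses far richer signals than this.
sessions = {
    "user_a": ["fifa19_goals", "fifa19_tips", "fifa19_packs"],
    "user_b": ["fifa19_goals", "makeup_tutorial"],
    "user_c": ["fifa19_goals", "fifa19_packs"],
    "user_d": ["makeup_tutorial", "makeup_haul"],
}

# Co-watch counts: "people who watched X went on to watch Y".
co_watch = defaultdict(Counter)
for videos in sessions.values():
    for i, video in enumerate(videos[:-1]):
        co_watch[video][videos[i + 1]] += 1

def recommend(current_video, personal_history, n=3):
    """Blend 'watched next' popularity with the user's own history."""
    scores = Counter(co_watch[current_video])
    for video in scores:
        topic = video.split("_")[0]
        # Crude personalisation: boost topics the user already watches.
        scores[video] += sum(1 for v in personal_history if v.startswith(topic))
    return [v for v, _ in scores.most_common(n)]

# Someone with a makeup-heavy history who just watched the top FIFA video:
print(recommend("fifa19_goals", ["makeup_tutorial", "makeup_haul"]))
# -> ['makeup_tutorial', 'fifa19_tips', 'fifa19_packs']
```

Run it and a viewer with a makeup-heavy history who just watched the top FIFA video gets a blend of makeup and FIFA suggestions, much like my own results above.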


Why all the videos of little girls?

The "worm hole" that Matt Watson described is pretty simple to explain using the above logic. If I click on a video of a little girl, YouTube's algorithm pushes me recommendations based on what other people who used the same search term and watched that video were most likely to click next. And the more of these videos I click on and watch, the more YouTube thinks I like them, so the algorithm pushes me even more of the same. Unfortunately it becomes pretty unsavoury pretty fast.
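
Here's a deliberately crude little simulation of that feedback loop, entirely my own invention and nothing like YouTube's actual system, just to show how quickly one kind of click can take over a profile:

```python
# A deliberately crude feedback loop, nothing like YouTube's real code.
interests = {"gaming": 1.0, "music": 1.0, "kids_gymnastics": 1.0}

def top_recommendation(interests):
    """Serve the topic the 'algorithm' currently thinks I like most."""
    return max(interests, key=interests.get)

interests["kids_gymnastics"] += 0.5  # one click of the worrying kind...

for step in range(4):
    rec = top_recommendation(interests)  # the algorithm serves my top topic
    interests[rec] += 1.0                # ...and I click it, reinforcing it
    share = interests["kids_gymnastics"] / sum(interests.values())
    print(f"step {step}: recommended {rec!r}, now {share:.0%} of my profile")
```

Within a handful of clicks one topic dominates the simulated profile. That is the worm hole in miniature.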


That YouTube didn't detect the videos as harmful is not altogether surprising. The content is mostly fairly innocent: young girls filming themselves doing normal, everyday little-girl stuff. The shocking part is not necessarily the video content itself, but how platform users are watching and interacting with it. The only analogy I can think of for the behaviour of these users is taking a family photo album and using it as porn to get your kicks. It is worse than that, though, because of the broad and public availability of the videos on YouTube, and the fact that these predators can actually contact the little girls in the videos just by leaving a comment.


Why wasn't this detected sooner?

The reason I think this feels so shocking is that we expect Google to have detected this, and we feel betrayed that YouTube were unable to rapidly analyse, detect and shut it down. Why wasn't it easy for Google to automatically pick this up as unusual audience behaviour, perhaps using the number of comments with time codes as a big red flag? Surely it's someone's job to scour online forums for inappropriate posts that link back to their platform, isn't it? But I'm not a Google engineer, and maybe this behaviour too closely mimicked that of 'normal' users on completely different and innocuous types of content. I would also like to think Google were busy picking up and shutting down other harmful content rings and audience segments before we were even aware they existed. I badly want this to be true!
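
Just to illustrate what that hypothetical red flag might look like, here is a sketch of a time-code detector. The regex, baseline rate and threshold are all my guesses, not anything Google has confirmed using, and a real system would need per-category baselines because time codes are perfectly normal on, say, music videos or tutorials:

```python
import re

# Matches time codes like "1:37", "12:05" or "1:02:30" inside a comment.
TIMECODE = re.compile(r"\b\d{1,2}:\d{2}(?::\d{2})?\b")

def timecode_share(comments):
    """Fraction of comments containing at least one time code."""
    if not comments:
        return 0.0
    return sum(1 for c in comments if TIMECODE.search(c)) / len(comments)

def flag_for_review(comments, baseline=0.02, multiplier=10):
    """Flag a video for human review when time-code comments massively
    exceed a (made-up) platform-wide baseline rate."""
    return timecode_share(comments) > baseline * multiplier

comments = ["so cute", "check 1:37", "2:10 and 4:45!!", "nice video"]
print(timecode_share(comments), flag_for_review(comments))  # 0.5 True
```

On the sample comments above, half contain time codes and the video gets flagged for a human to look at.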


What have YouTube and Google done about this?

This week YouTube have reported that otherwise ad-friendly videos can be demonetised if the comments on them are inappropriate, which means your ad dollars are less likely to be associated with these videos of little girls going forward. A Google spokesperson has been quoted in the press as saying: "Any content — including comments — that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube. We took immediate action by deleting accounts and channels, reporting illegal activity to authorities and disabling comments on tens of millions of videos that include minors."


What now?

In the short term, ad revenue on the platform will be reduced as brands withdraw their ad spend to make a statement following the shocking headlines of the past couple of weeks. Video advertisers will have less inventory to buy and could potentially miss out on reaching this highly engaged, diverse and vibrant audience. It doesn't have to be this way, as it is perfectly possible to place ads in very controlled environments on YouTube as long as you are prepared to pay the going rate. Talk to your Google rep, ad networks and MCNs about contextual buying and channel targeting.


YouTube channel owners are likely to see a fall in CPMs across their content. Some may see popular videos demonetised that were previously ticking along just fine. Creators should keep an eye on this in YouTube Studio and be ready to hit the 'appeal' button once they've checked their comments look brand safe.
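
If you're comfortable with a bit of code, you can do a rough check of your own comments using the public YouTube Data API (v3). This sketch is mine, not YouTube's: you'd need your own API key, it only looks at the first 100 top-level comments, and on videos featuring minors the comments may already be disabled, in which case there is nothing to fetch.

```python
import re
from googleapiclient.discovery import build  # pip install google-api-python-client

API_KEY = "YOUR_API_KEY"    # create your own key in the Google Cloud console
VIDEO_ID = "YOUR_VIDEO_ID"  # the video you want to sanity-check

# Same sort of time-code pattern as the heuristic sketched earlier.
TIMECODE = re.compile(r"\b\d{1,2}:\d{2}(?::\d{2})?\b")

youtube = build("youtube", "v3", developerKey=API_KEY)

# Fetch the first 100 top-level comments on the video.
response = youtube.commentThreads().list(
    part="snippet",
    videoId=VIDEO_ID,
    textFormat="plainText",
    maxResults=100,
).execute()

items = response.get("items", [])
suspect = [
    item["snippet"]["topLevelComment"]["snippet"]["textDisplay"]
    for item in items
    if TIMECODE.search(item["snippet"]["topLevelComment"]["snippet"]["textDisplay"])
]

print(f"{len(suspect)} of {len(items)} comments contain time codes")
for text in suspect[:10]:
    print("-", text)
```

Anything this turns up is worth cleaning out before you hit 'appeal'.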


It is a shame that the platform and content creators will again be adversely affected by the actions of a twisted minority, but it is a direct result of the platform being open to everyone to upload and everyone to watch. This democracy of content and audience is both a risk and what makes YouTube great.

