Moderation and Child Safety Online: Video and Image Filtering

June 22, 2017

The internet has become a major source of entertainment for all ages. YouTube and Facebook alone offer a vast selection of content to keep us entertained. Some parents even let their young children use these sites to keep them occupied, watching ‘G’-rated cartoons on YouTube for a few minutes while the parents attend to household chores. Some older, though still underage, children access Facebook or are even allowed to create an account, scrolling through photos and adding friends.


However, despite the considerable efforts many companies put into policing content, the internet is not always friendly or safe. Even YouTube and Facebook are full of fake videos, misleading photos, and content unsuitable for younger viewers. And even with advanced parental-control features in browsers, we cannot be sure these platforms remain child-friendly.

One example of a video slipping through the filter is a parody of the British animated series Peppa Pig titled “#Peppa #Pig #Dentist #Animation #Fantasy.” In it, Peppa visits a dentist armed with a giant needle and an array of frightening tools, and the pigs are mysteriously green rather than pink. Nor is this the only innocent cartoon that has been remade as an inappropriate parody: Thomas the Tank Engine, Doc McStuffins, and Elsa from Frozen are just a few of the characters that have appeared in such videos.

The BBC said it used Facebook’s report button to alert the company to 100 images that appeared to break its guidelines against obscene and sexually suggestive content, including pages explicitly aimed at men with a sexual interest in children. Of those 100 images, only 18 were removed by Facebook. The organization also found the profiles of five convicted pedophiles and reported them through Facebook’s system, yet none of the five was taken down, despite Facebook’s rules forbidding convicted sex offenders from holding accounts. Children cannot be entirely deprived of online entertainment and education, but neither can we fully rely on the algorithms behind child-safety filters. With content moderation filtering out inappropriate videos and images, however, the internet can be made safer for children.

Algorithms and automated systems are not the only forces behind video and image filtering. User cooperation is another form of moderation: users flag content they consider inappropriate so that company moderators can review it. Outsourced moderation also plays a significant role in filtering images and videos. Even a company with an in-house team of content moderators is no match for the speed and volume of content posted across social media platforms and forums. To keep radical and graphic content from spreading and reaching underage audiences, companies have turned to outsourced content moderation. Content moderators and purpose-built AI systems can process millions of posts a day, deleting obscenity, gore, violence, and terrorist propaganda from company platforms and websites, whether manually or automatically. Human moderators specialize in the cases automation cannot handle, such as judgment calls, and their decisions can also be used to train the AI systems.
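The flag-then-review workflow described above can be sketched as a simple routing function. This is only an illustrative sketch, not any platform's actual system: the thresholds, the tiny blocklist, and the way user flags feed into the score are all assumptions made for the example, standing in for a real machine-learning classifier.

```python
from dataclasses import dataclass

# Hypothetical thresholds; real platforms tune these against labeled data.
AUTO_REMOVE_THRESHOLD = 0.9
HUMAN_REVIEW_THRESHOLD = 0.5

@dataclass
class Post:
    post_id: int
    text: str
    user_flags: int = 0  # how many times users have reported this post

def classifier_score(post: Post) -> float:
    """Stand-in for an ML model: the score rises with user flags and
    with matches against a tiny illustrative blocklist."""
    blocklist = {"gore", "violence"}
    hits = sum(word in post.text.lower() for word in blocklist)
    return min(1.0, 0.3 * hits + 0.2 * post.user_flags)

def route(post: Post) -> str:
    """Auto-remove clear violations, queue borderline cases for human
    moderators (the 'judgment call' cases), otherwise allow the post."""
    score = classifier_score(post)
    if score >= AUTO_REMOVE_THRESHOLD:
        return "removed"
    if score >= HUMAN_REVIEW_THRESHOLD:
        return "human_review"
    return "allowed"
```

The design mirrors the division of labor in the text: automation handles the clear-cut, high-volume cases, while the middle band of uncertain scores is escalated to human moderators, whose verdicts can then be fed back as training data.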

Outsourcing companies that specialize in content moderation help maintain a brand's reputation and credibility with its users. Companies rely on outsourced content moderators to keep their sites safe and filled with content suitable for their audience, especially when much of that audience consists of children and young teenagers.