Moderation and Child Safety Online: Video and Image Filtering

June 22, 2017

The internet has become a major source of entertainment for all ages. YouTube and Facebook alone offer a vast selection of content to keep us entertained. Some parents even let their young children use these sites to keep them occupied, watching age-appropriate cartoons on YouTube for a few minutes while they attend to household chores. Older children who are still underage may also be allowed to create Facebook accounts, scrolling through photos and adding friends across different social media channels.


However, despite the considerable efforts of social media moderation companies to protect their platforms, the dangers of social media remain, and the internet is not always friendly or safe. Even YouTube and Facebook are full of fake videos, photos, and content unsuitable for younger viewers. Even with advanced browser parental controls, we cannot be sure these platforms stay child-friendly. Parents want to ensure that the dangers of social media do not reach their children while they are online.

One example of such a video slipping through video filter systems is a parody of the British animated series Peppa Pig titled “#Peppa #Pig #Dentist #Animation #Fantasy.” The video shows Peppa visiting a dentist who wields a giant needle and an array of frightening tools, and the pigs are mysteriously green rather than pink. Peppa Pig is not the only innocent cartoon remade as an inappropriate parody: Thomas the Tank Engine, Doc McStuffins, and Elsa from Frozen are just a few of the characters that have appeared in similar parodies.

The BBC said it used Facebook’s image moderation report button to alert the company to 100 images that appeared to break its guidelines against obscene and sexually suggestive content, including pages set up explicitly for men with a sexual interest in children. Of those 100 images, only 18 were removed by Facebook. The organization also found profiles of five convicted pedophiles and reported them through Facebook’s system, but none of the five profiles were taken down, despite Facebook rules forbidding convicted sex offenders from holding accounts. You cannot deprive children of online entertainment and education, but neither can you rely entirely on the algorithms behind a child safety filter as the only moderation method. With the use of content moderation, especially image moderation and video moderation, the dangers of social media can be significantly reduced.

Algorithms and automated systems are not the only forces behind image and video moderation. User cooperation is another form of moderation: users flag content they find inappropriate so it can be reviewed by social media moderation companies. Outsourced moderation also plays a significant role in filtering images and videos. Even when a company employs an in-house online content moderator, one person is no match for the speed and volume of content posted across social media platforms and forums. To keep radical and graphic content from spreading and reaching underage audiences, social media moderation companies have turned to outsourced content moderation. An online content moderator, with the help of purpose-built AIs, can go through millions of posts a day, manually or automatically removing obscenity, gore, violence, and terrorist propaganda from a company’s social media feeds and websites. Human moderators specialize in handling anything automation cannot filter reliably (such as judgment calls) and can also help train the AIs.
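
To make that workflow concrete, here is a minimal sketch of how such a hybrid pipeline might route content. It is an illustration only: the names (`auto_classify`, `route`, `ReviewQueue`), thresholds, and keyword-based scoring are assumptions for the example, not any platform’s actual system. Clearly violating posts are removed automatically, borderline or user-flagged posts go to a human review queue, and the humans’ decisions are recorded as labeled examples that could later retrain the classifier.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# Hypothetical illustration only: names and thresholds are assumptions,
# not the API of any real moderation platform.

@dataclass
class Post:
    post_id: str
    content: str
    user_flags: int = 0  # how many users reported this post

@dataclass
class ModerationResult:
    post_id: str
    action: str   # "allow", "remove", or "human_review"
    score: float  # estimated probability the post violates policy

def auto_classify(post: Post) -> float:
    """Stand-in for a purpose-built AI classifier.

    A real system would run image/video/text models; here we fake a score
    with simple keyword matching so the example stays self-contained.
    """
    banned_terms = ("gore", "violence", "terrorist propaganda")
    hits = sum(term in post.content.lower() for term in banned_terms)
    return min(1.0, 0.4 * hits)

def route(post: Post,
          remove_threshold: float = 0.9,
          review_threshold: float = 0.5) -> ModerationResult:
    """Decide what happens to a post.

    High-confidence violations are removed automatically; borderline scores
    or any user-flagged post go to human review (the "judgment call" cases);
    everything else is allowed through.
    """
    score = auto_classify(post)
    if score >= remove_threshold:
        action = "remove"
    elif score >= review_threshold or post.user_flags > 0:
        action = "human_review"
    else:
        action = "allow"
    return ModerationResult(post.post_id, action, score)

@dataclass
class ReviewQueue:
    """Items awaiting a human moderator's decision."""
    pending: List[ModerationResult] = field(default_factory=list)
    labeled: List[Tuple[str, bool]] = field(default_factory=list)

    def record_decision(self, result: ModerationResult, violates: bool) -> None:
        # Human decisions become labeled examples that can later be used
        # to retrain the automated classifier ("teaching the AI").
        self.labeled.append((result.post_id, violates))

if __name__ == "__main__":
    queue = ReviewQueue()
    posts = [
        Post("1", "A cheerful cartoon about friendship"),
        Post("2", "Scene containing gore and violence", user_flags=3),
        Post("3", "Harmless vacation photos", user_flags=1),
    ]
    for post in posts:
        result = route(post)
        print(result)
        if result.action == "human_review":
            queue.pending.append(result)
```

The key design point is the middle band: automation handles the unambiguous cases at scale, while anything uncertain or flagged by users is deferred to a person, and those human decisions feed back into the automated filter.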

Companies specializing in content moderation help maintain a brand website’s reputation and credibility with its users. They rely on outsourced teams of online content moderators to keep sites safe, removing the dangers of social media and filling them with content suitable for their audience, especially when most of that audience is children or teenagers.