Moderation and Child Safety Online: A Guide to Safe Chatting
The fast-moving nature of communication and the anonymity it allows are major reasons for the popularity of chat rooms. In a chat room, users can be whoever they want to be, giving them the chance to experiment and express themselves with less inhibition, and more risk-taking, than in the real world. Although this can be fun, liberating and healthy, it also has its negative sides.
There are obvious potential risks in communicating with people you do not know and, unfortunately, some children have been hurt after meeting the ‘friends’ they made online. Adults who are sexually interested in children use chat rooms and other interactive areas online to make contact with and befriend children, persuading and manipulating their victims into meeting them. There have also been many cases of children being bullied or threatened in chat environments by other children.
Moderation can keep chat and other public interactive services safe for children and provide a positive user experience by filtering or removing inappropriate, offensive and unsafe posts. Although a moderated service is no guarantee of a child’s safety, it does vastly improve the chat environment. It is good practice for chat providers to state clearly and prominently whether the service is moderated and, if so, what kind of moderation is used.
There are different ways businesses can carry out moderation:
A software system automatically filters out words and phrases it has been programmed to identify, such as swear words or personal details like email addresses and phone numbers.
Human moderators review the content posted by users and can, for instance, remove inappropriate posts or even bar a user from the service. This can be done in several ways, each with implications for the level of safety:
Pre-moderation: all material is seen and checked before it is published. Although this slows communication, it is probably the safest form of moderation and is well suited to younger children.
Post-moderation: the moderator reviews content after it is published and removes it if it clearly breaks the rules of the chat service (such as the posting of personal information).
Sample moderation: the moderator reviews a sample of the material posted and patrols a number of interactive spaces.
Reactive moderation: a moderator is on hand to respond when requested to do so by a user.
Technical moderation is a useful complement to human moderation, assisting with the removal of inappropriate posts. However, it has not been shown to provide the same level of child protection as human moderation.
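To illustrate, the kind of automatic filtering described above can be sketched in a few lines of code. This is a minimal, illustrative example, not any real service's filter: the banned-word list and the email and phone-number patterns are assumptions chosen for demonstration.

```python
import re

# Illustrative banned-word list; a real service would maintain a much larger one.
BANNED_WORDS = {"badword", "swearword"}

# Simple patterns for personal details; real filters use more robust detection.
EMAIL_RE = re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b")
PHONE_RE = re.compile(r"\b\+?\d[\d\s-]{7,}\d\b")

def filter_message(text: str) -> str:
    """Mask personal details and banned words before a message is shown."""
    text = EMAIL_RE.sub("[removed]", text)   # strip email addresses
    text = PHONE_RE.sub("[removed]", text)   # strip phone-number-like digit runs
    cleaned = []
    for word in text.split():
        if word.strip(".,!?").lower() in BANNED_WORDS:
            cleaned.append("****")           # mask programmed words
        else:
            cleaned.append(word)
    return " ".join(cleaned)
```

For example, filter_message("email me at kid@example.com") would return "email me at [removed]". The sketch also shows why such filters fall short of human moderation: grooming or bullying phrased in ordinary language passes straight through.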
Keeping children safe online starts at home. Indeed, parents can be the best moderators of their children's online activities. Here are some things parents can do to keep chatting as safe as possible for their youngsters:
Parents should have basic computer skills so they can teach their children how to chat safely and protect the whole family from computer viruses, fraud, and spam. You can also find advice in chat rooms and online communities designed for parents. In addition, familiarize yourself with the chat language your kids use so that you can understand their conversations with online friends.
As much as possible, do not let your children chat unaccompanied. If you cannot always be there to watch over their online activities, point them to reputable websites offering chat and assist them when they register and chat for the first few times. You should also monitor the amount of time they spend online. Too much time at the computer can cause not just safety issues but also health problems such as RSI (repetitive strain injury) and eye strain, or simply lead to a lack of physical fitness.