What is content moderation?
Content moderation is the process of evaluating user-generated content (UGC) to make sure it is appropriate for the platform and not harmful to other users. Content moderation should follow universal expectations of human decency so that platforms stay free from hate, bullying, and discrimination. But it can also ensure that posted content fits the specific rules or community guidelines of an online space, keeping posts relevant to users and free of spam.
Content moderation can be done by users (e.g., by “reporting” inappropriate content) as well as by the owners of a platform. Many large social media platforms also hire independent content moderators or community managers to keep their communities welcoming.
Good content moderation requires clear penalties for violating content rules, whether it’s the removal of the content, a warning to the user, removal from the platform, or a combination of these.
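The escalating penalties described above can be sketched in code. This is a minimal illustration, not any platform's actual policy: the thresholds, action names, and the `UserRecord` structure are all assumptions made for the example.

```python
# Sketch of an escalating-penalty policy for content moderation.
# Thresholds and action names are illustrative assumptions, not taken
# from any real platform's rules.
from dataclasses import dataclass


@dataclass
class UserRecord:
    """Tracks how many times a user has violated the content rules."""
    user_id: str
    violations: int = 0


def apply_penalty(user: UserRecord, content_violates_rules: bool) -> str:
    """Return the moderation action, escalating with repeat offenses."""
    if not content_violates_rules:
        return "none"
    user.violations += 1
    if user.violations == 1:
        return "warn"            # first offense: warn the user
    elif user.violations <= 3:
        return "remove_content"  # repeat offense: remove the content
    else:
        return "ban"             # persistent abuse: remove the user


# Example: a user's penalties escalate across repeated violations.
alice = UserRecord("alice")
print(apply_penalty(alice, True))  # first violation -> "warn"
```

A real system would combine actions (for example, removing the content *and* warning the user) and let humans review borderline cases; this sketch only shows the escalation idea.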
Why is content moderation important?
Allowing users to generate content on a platform is a beautiful thing, leading to healthy exchanges and deep connections. But the negative side of user-generated content is exactly what was mentioned above: hate, bullying, discrimination, and harassment.
Good content moderation fundamentally allows users to feel safe. If people feel safe, they will engage and even be willing to be vulnerable. And when people are vulnerable, they form positive and long-lasting community connections.