What is content moderation?
Content moderation is the process of evaluating user-generated content (UGC) to make sure it's appropriate for the platform and not harmful to other users. Content moderation should follow universal expectations of human decency, keeping platforms free of hate, bullying, and discrimination. But it can also ensure that posted content fits the specific rules or community guidelines of an online space, so that posts stay relevant to users and free of spam.
Content moderation can be done by users (e.g., by "reporting" inappropriate content) as well as by the owners of a platform. Many large social media platforms also hire independent content moderators or community managers in an effort to keep their platforms friendly.
Good content moderation requires clear penalties for violating content rules, whether that means removing the content, warning the user, removing them from the platform, or a combination of these.
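For teams that build their own moderation tooling, here's a minimal sketch of what this kind of escalating-penalty flow might look like in code. It's written in Python, and the banned-term check, warning threshold, and data structures are all simplified assumptions for illustration, not a production moderation system:

```python
from dataclasses import dataclass

# Stand-in for real guideline checks; actual platforms combine
# classifiers, keyword lists, and human review.
BANNED_TERMS = {"spamlink.example", "insult-placeholder"}

@dataclass
class Member:
    name: str
    warnings: int = 0
    banned: bool = False

def violates_guidelines(post: str) -> bool:
    """Flag a post that contains any banned term."""
    text = post.lower()
    return any(term in text for term in BANNED_TERMS)

def moderate(member: Member, post: str) -> str:
    """Apply escalating penalties: remove content, warn, then ban."""
    if member.banned:
        return "rejected: member is banned"
    if not violates_guidelines(post):
        return "approved"
    member.warnings += 1
    if member.warnings >= 3:  # the three-strikes threshold is an assumption
        member.banned = True
        return "content removed; member banned after repeated violations"
    return f"content removed; warning {member.warnings} issued"
```

In practice, the important part isn't the code but the policy it encodes: users should be able to predict exactly what happens when a rule is broken.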
Why is content moderation important?
Allowing users to generate content on a platform is a beautiful thing, one that leads to healthy exchanges and deep connections. But user-generated content also has a dark side: the bullying, discrimination, and harassment mentioned above.
Good content moderation fundamentally allows users to feel safe. When people feel safe, they engage, and they're even willing to be vulnerable. And when people are vulnerable, that openness creates positive, long-lasting community connections.