Content Moderation

What is content moderation?


Content moderation is the process of evaluating user-generated content (UGC) to make sure it’s appropriate for the platform and not harmful to other users. Content moderation should follow universal expectations of human decency, keeping platforms free from hate, bullying, and discrimination. But it can also ensure that posted content fits the specific rules or community guidelines of an online space, so that posts stay relevant to users and free of spam.


Content moderation can be done by users (e.g. by “reporting” inappropriate content) as well as by the owners of a platform. Many large social media platforms also hire independent content moderators or community managers to keep their platforms friendly.


Good content moderation requires clear penalties for violating content rules, whether that’s removing the content, warning the user, removing the user from the platform, or a combination of these.


Why is content moderation important?


Allowing users to generate content on a platform is a beautiful thing, leading to healthy exchanges and deep connections. But user-generated content also has a dark side: hate, bullying, discrimination, and harassment.


Good content moderation fundamentally allows users to feel safe. When people feel safe, they engage and are even willing to be vulnerable. And that vulnerability creates positive, long-lasting community connections.


Now Read: 5 Essential Tips for Improving Member Engagement