Content Moderation

What is content moderation?


Content moderation is the process of evaluating user-generated content (UGC) to make sure it’s appropriate for the platform and not harmful to other users. At a minimum, content moderation should follow universal expectations of human decency, keeping platforms free from hate, bullying, and discrimination. It can also ensure that posts fit the specific rules or community guidelines of an online space, so that content stays relevant to users and free of spam.


Content moderation can be done by users themselves (e.g., by “reporting” inappropriate content) as well as by the owners of a platform. Many large social media platforms also hire independent content moderators or community managers to keep their platforms safe and welcoming.


Good content moderation requires clear penalties for violating content rules, whether that means removing the offending content, warning the user, removing the user from the platform, or some combination of these.


Why is content moderation important?


Allowing users to generate content on a platform is a beautiful thing, leading to healthy exchanges and deep connections. But user-generated content also has a dark side: the bullying, discrimination, and harassment mentioned above.


Fundamentally, good content moderation allows users to feel safe. If people feel safe, they will engage and even be willing to be vulnerable. And when people are vulnerable, they form positive, long-lasting community connections.

