Content Moderation

Content moderation is the process of monitoring and screening user-generated content against a platform's rules and guidelines. Moderators determine whether submitted content is beneficial to the platform and its users, and therefore whether it merits publication. There are two kinds of content moderation error: inadvertently permitting the publication of harmful material (a false negative) and incorrectly flagging legitimate content as harmful (a false positive). Both harm the platform: the former by harming the user, the latter by harming the well-intended contributor. Effective content moderation keeps both error types as close to zero as possible.
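To make the two error types concrete, the following minimal sketch counts them for a batch of moderation decisions. It is illustrative only: the submissions, labels, and field names are hypothetical, not a description of any particular platform's tooling.

```python
from dataclasses import dataclass

@dataclass
class Submission:
    text: str
    harmful: bool   # ground truth, e.g. from an audit sample (hypothetical)
    blocked: bool   # the moderation decision that was actually made

def error_rates(submissions):
    """Return (false_negative_rate, false_positive_rate)."""
    harmful = [s for s in submissions if s.harmful]
    benign = [s for s in submissions if not s.harmful]
    # False negative: harmful content that was published anyway.
    fn = sum(1 for s in harmful if not s.blocked)
    # False positive: legitimate content that was blocked.
    fp = sum(1 for s in benign if s.blocked)
    fn_rate = fn / len(harmful) if harmful else 0.0
    fp_rate = fp / len(benign) if benign else 0.0
    return fn_rate, fp_rate

if __name__ == "__main__":
    sample = [
        Submission("spam link", harmful=True, blocked=True),
        Submission("abusive post", harmful=True, blocked=False),      # false negative
        Submission("product review", harmful=False, blocked=False),
        Submission("legitimate criticism", harmful=False, blocked=True),  # false positive
    ]
    fn_rate, fp_rate = error_rates(sample)
    print(f"false negative rate: {fn_rate:.0%}, false positive rate: {fp_rate:.0%}")
```

Tracking both rates separately matters because pushing one toward zero (for example, blocking aggressively) tends to raise the other; effective moderation manages the trade-off rather than optimizing a single number.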

Everise is an expert at moderating user-generated content in ways that both protect your users and deliver an outstanding experience for contributors. We understand that effective moderation is vital to preserving positive experiences.

