Have you ever wondered why Facebook removed a post from its platform? And how the company decides what content should be removed or kept? Well look no further – Facebook has just released its Community Standards that the company’s content reviewers use to make these tough decisions.
In the past, only Facebook’s content reviewers had access to this document. The move to allow the public to finally see these once internal guidelines comes after The Guardian obtained and posted snippets of the company’s extensive and at times contradictory rules last year.
Facebook states that its current Standards apply around the world to all types of content and are designed to be comprehensive. The purpose of its Community Standards is to encourage expression and create a safe environment for everyone on the platform. Facebook developed these policies based on input from its users and from experts in the technology and public safety fields.
The company has broken these standards down into the following six categories:
In addition to releasing its Community Standards, Facebook is now giving its users the ability to appeal its decisions on individual posts. Basically, this allows users to ask for a second opinion if Facebook has removed one of their posts and they believe the company made a mistake in doing so.
As of right now, users can only appeal posts removed for nudity/sexual activity, hate speech, or graphic violence. If Facebook removes a post for one of those reasons, the user will receive a notification about the action, along with the option to request an additional review of the post. Within 24 hours of initiating an appeal, users should know whether Facebook plans to restore the post or keep it off the platform permanently.
If you have any questions about how Facebook’s Community Standards may affect your business, please reach out to the Social Media experts at MoreVisibility.