Facebook's once-secret censorship rules are now public. Removals can also be appealed

For the first time, Facebook has published the detailed rules under which problematic posts can be removed. The company has also introduced the option of a quick appeal against the removal of a post that Facebook judged to violate its rules.

The conversations that take place on Facebook mirror the diversity of a community of more than two billion people. People communicate across geographical and cultural boundaries and in many languages, according to the community rules Facebook published today.


A page with an overview of the rules for removing posts from Facebook.

For the first time, the social network has published the set of rules its employees follow when removing so-called dangerous or harmful content. In the past, users of the platform had only general information about the limits on publishing problematic content, such as drugs, hate speech, or sexual violence. Now the company's detailed internal rules are available on the Facebook website.

The rules for judging content are no longer secret

"People on Facebook should know where we draw the red line, what is OK and what is not," said the company's vice president Monika Bickert. The company is often criticized by governments and human rights organizations for failing to prevent expressions of hatred and for being misused by terrorists and perpetrators of sectarian violence. Other users, on the contrary, criticize Facebook for being too puritanical about nude photographs, or for ignoring context when it comes to news and historical material. Members of the US Congress, in turn, criticized Facebook founder Mark Zuckerberg over claims that Facebook suppresses conservative views in the United States (more in our article).


Transparency above all (quotes from Facebook's community rules)

Above all, Facebook promises transparency and openness: "We are publishing the detailed internal guidelines by which Facebook assesses content," said Siobhan Cummiskey, Facebook's Public Policy Manager for EMEA. "We want to emphasize that Facebook is not a place for harmful content, and we want to show people what we decide may stay on Facebook and what may not. We expect this will stimulate a debate about these community rules and how to improve them."

Facebook's community rules describe the individual types of content in detail, and each category has a detailed commentary. Forbidden content includes, for example, bullying and harassment, information obtained by hacking, drug distribution, the sale of firearms, and suicide guides.

Rules for content on Facebook (examples)

Violence
- Remove: credible threats of violence, credible calls for violence, instructions for making weapons, recruitment for terrorist organizations, depiction and promotion of crime, coordination of violence
- Keep: information about violence, general (non-credible) calls for violence, information about terrorist organizations, information about crime, satire, debate about crime

Prohibited goods
- Remove: sale of illegal drugs, instructions for producing illegal drugs, sale of marijuana, private sale of firearms and ammunition
- Keep: alcohol, tobacco, official shops selling firearms (to buyers over 21)

Nudity and sex
- Remove: child nudity, nudity shared without consent, sexual services, sexual acts, exposed genitals, exposed female breasts
- Keep: nudity as a form of political protest, nudity in the context of illness, breastfeeding, photographs of works of art depicting nudity

Bullying
- Remove: bullying, threats, stalking
- Keep: warnings or debate about bullying

Hate speech
- Remove: attacks on people based on so-called protected characteristics, i.e. race, ethnicity, nationality, religious belief, sexual orientation, sex, gender identity, or serious disability and disease; calls for exclusion and segregation
- Keep: criticism of violence, sharing someone else's hate speech in order to draw attention to it, satire and irony, social commentary

Spam
- Remove: commercial spam, false advertising, like-baiting
- Keep: advertising, commercial information

Authentic communication
- Remove: communication under a false identity, impersonating someone else
- Keep: parody, pseudonyms

Copyright
- Remove: content that infringes copyright
- Keep: your own work, others' work shared with written consent, links

False news (Facebook does not remove these, but in certain circumstances limits their reach): untrue news, fake news, parody news, opinions

Users can now appeal

In addition to publishing the rules, the new measures for the first time allow users to protest against the removal of content and ask for a review. Until now, this option applied only to the removal of entire accounts. If a photo, video, or specific post is removed, a button requesting a review will appear on the user's profile. If the user requests the review, Facebook will examine the content again, usually within 24 hours. If an error is found, the removed content will be restored and the publisher notified.

Bickert told the Reuters agency that the company's rules for judging harmful content change routinely, among other things on the recommendations of more than a hundred expert organizations working in counter-terrorism, abuse, and other fields. "We discuss changes to the rules at Facebook headquarters every two weeks," Bickert said.

According to the AP agency, Facebook's censors work in thirty languages, and problematic content reported to them is usually removed within 24 hours, although the company has no formal time limit.

When things get serious, Facebook communicates directly with the police

There are cases where Facebook not only removes content but also takes further action. One such case is a threat to life, for example when someone on Facebook publishes a credible threat against other people, or when someone threatens suicide. "When deciding whether a threat is credible, we may also consider additional information, such as the public visibility and vulnerability of the person. If there is a threat to physical or public safety, we remove the content, disable accounts, or contact law enforcement," the community policy states.

We asked Siobhan Cummiskey what such communication with law enforcement looks like in practice, and how Facebook assesses the information in such a case. "If we find information about a credible threat, it depends on the specific situation. We consider the circumstances of the person concerned, carry out a detailed internal investigation, and if we conclude that the situation is serious, we use all the information we have. Safety is an absolute priority."

"If we conclude that a life is in danger, we communicate with the local police," Cummiskey continued. "We have regulations on what we communicate and under what circumstances. When a life is at stake, we proactively tell the police whatever could help save that life." In other cases, the police must ask Facebook for information and go through an approval process.

Update: We have added quotes to the article, an overview of Facebook's community rules, and links to related information.