Further extending its efforts to create only positive experiences on the platform, Facebook recently announced that it is now removing posts that provoke violence. In a statement shared by The Verge, Facebook says, “There are certain forms of misinformation that have contributed to physical harm, and we are making a policy change which will enable us to take that type of content down. We will begin implementing the policy during the coming months.”
Under the new policy, Facebook, in partnership with threat intelligence agencies, will review any post containing inaccuracies or misinformation intended to cause violence or physical harm. The review covers both text and images—even if the text of a post doesn’t provoke violence, an attached image that does is cause for removal.
Additionally, Facebook will determine whether a post should be removed based on press reports and input from its own public policy employees. Once a post is determined to be false, Facebook will remove it, along with any duplicate or shared versions across the platform.
Facebook announced the new policy this week, but according to The Verge, it was put into effect last month.
The company is working harder than ever to make the platform a safe space for users; however, outside factors aren’t making it easy. In September 2017, for instance, Facebook had to remove self-reported ad targeting after advertisers used the tool to target “Jew haters.” With every punch, however, Facebook follows with a strong defensive swing. Hopefully, this latest swing will effectively prevent violence incited through the platform.
Facebook’s Community Standards Ban Violent and Criminal Behavior
To prevent offline harm caused by content posted to Facebook, the social media company takes action against accounts that violate its Community Standards, which contain four clauses addressing violence, dangerous groups and organizations, crime, and coordinated harm.
Violence and Incitement
As stated in its Community Standards, Facebook removes content that incites violence. Accounts that post such content will be disabled, barring the offenders from the platform. Content Facebook deems to incite violence includes the following:
- Posts stating the intent to commit violence
- Posts encouraging others to commit violent crimes
- Posts advocating for violent crimes
- Posts that solicit the services of hitmen, mercenaries, or assassins
- Posts that instruct users how to create weapons or explosives
- Posts with imagery that suggests violence against an individual, such as a silhouette of a gun pointed at a person
Of course, Facebook recognizes that some posts or comments containing language that incites violence are made in jest and should therefore not be taken seriously. To distinguish credible threats from casual statements, Facebook analyzes the language and context of the post. The company also considers the original poster’s public visibility and vulnerability.
Dangerous Individuals and Organizations
In addition to curbing posts that incite offline violence, Facebook also bans groups, organizations, and individuals that proclaim a violent or criminal mission. Facebook determines a group, organization, or individual to be a threat if they participate in terrorism, organized hate, mass or serial murder, human trafficking, or organized crime or violence. Posts that promote such activities will be deleted, and the accounts behind them disabled.
In the past, users have publicized their crimes by posting live video footage on Facebook. Doing so violates Facebook’s policy against promoting crimes such as murder, theft, or fraud. “We do not condone this activity. There is a risk of copycat behavior,” Facebook’s Community Standards say on the matter. Although Facebook deletes such content and punishes the offending account, it does allow users to post content depicting violent or criminal activity as long as its purpose is to spark debate, educate, or make a satirical comment on an issue.
Facebook lists the following activities as criminal, meaning posts depicting them will be deleted from the platform:
- Physical harm against individuals
- Physical harm against animals
- Poaching or selling endangered species
- Animal fights
- Sexual violence
Because Facebook doesn’t want people to use its platform as a tool to organize offline harm, the company shuts down any online activity meant to harm others in the real world. For example, a Facebook group created to organize a riot will be deleted and the perpetrators reported to the police. In addition to physical harm against humans and animals, Facebook also bans activity meant to organize and execute swatting, theft, vandalism, voter suppression, trafficking, sexual assault, and arranged marriages.
By Anna Hubbel