Facebook has been cracking down on bad experiences across all its social platforms. For example, last month the company announced its Data Abuse Bounty Program, which rewards users who report data misuse and helps Facebook identify app developers abusing its platform. Back in the fall, Facebook stopped allowing self-reported targeting after some advertisers used it to target “Jew haters.” And recently, the company told advertisers it is expanding its efforts to reduce clickbait and deceptive links to low-quality websites. Now, Facebook is introducing new tools to its messaging app, Messenger, that make it easier for users to report conversations that violate the network’s Community Standards.
Previously, users could report Messenger conversations only through Facebook’s reporting tools or through Messenger on the web; there was no option for mobile. Now, Facebook says users can report conversations directly from their iOS or Android devices.
Reporting Messenger Conversations
Users can report a conversation they suspect violates Facebook’s Community Standards by doing the following:
Step 1: Open Messenger and go to the conversation in question.
Step 2: Tap the name of the person or group involved in the conversation.
Step 3: Scroll to “Something’s Wrong” in the list of options that appear.
Step 4: Choose the category that best fits the conversation, such as harassment, hate speech, or pretending to be someone else.
Facebook says users also have the option to ignore or block the person or group being reported. The network will send a confirmation message to indicate the report was successfully submitted.
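To make the flow concrete, here is a minimal, purely hypothetical sketch in Python of how a client might model the reporting steps above. The ReportCategory values, the report_conversation helper, and the follow-up options are illustrative assumptions based on the article’s description, not Facebook’s actual Messenger API.

```python
from dataclasses import dataclass
from enum import Enum


class ReportCategory(Enum):
    """Illustrative categories mirroring the options described above (assumed names)."""
    HARASSMENT = "harassment"
    HATE_SPEECH = "hate_speech"
    IMPERSONATION = "pretending_to_be_someone_else"
    OTHER = "other"


@dataclass
class ReportResult:
    """Outcome of filing a report: a confirmation plus optional follow-up actions."""
    confirmed: bool
    follow_up_options: tuple


def report_conversation(conversation_id: str, category: ReportCategory) -> ReportResult:
    """Hypothetical client-side helper: submit a report for a conversation and
    surface the ignore/block follow-up choices mentioned in the article."""
    # A real client would call Messenger's (non-public) reporting endpoint;
    # here we only simulate a successful submission for illustration.
    print(f"Reporting conversation {conversation_id} for {category.value}")
    return ReportResult(confirmed=True, follow_up_options=("ignore", "block"))


if __name__ == "__main__":
    result = report_conversation("conversation-123", ReportCategory.HARASSMENT)
    if result.confirmed:
        print("Report submitted. Follow-up options:", ", ".join(result.follow_up_options))
```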
“Providing more granular reporting options in Messenger makes it faster and easier to report things for our Community Operations team to review,” Hadi Michel, product manager for Facebook Messenger, says in a news post. Michel adds that the team reviews reports in over 50 languages. “This means our community will see issues addressed faster so they can continue to have positive experiences on Messenger.”
Although Facebook employs a massive network of people who review content across the globe, the company still can’t do it all on its own. It needs its users to help make its social platforms safe and enjoyable for all.