Each social media platform has its own struggles. There’s Snapchat, whose popularity, user base, and ad revenue lag behind those of its rivals. There’s Twitter, whose main predicament is spam bots. And then there’s Instagram, where fake engagement abounds.
But no social media platform has experienced a year as tumultuous as Facebook. The problems of other social media companies appear minuscule next to the barrage of scandals it has faced. Since the proliferation of fake news and ads in 2016, Facebook has been bombarded by one scandal after another, from Russian interference and Cambridge Analytica to the New Zealand shooting, in a seemingly endless cycle. Just when public ire abates, another controversy arises.
In the past two and a half years, Facebook has been improving its platform to better prevent abuse by bad actors. Recently, the company announced a slew of new features and changes designed to curb misinformation and prohibited activity. These changes are coming to News Feed, Groups, Messenger, and Community Standards.
Yet Another Algorithm Change
Facebook’s algorithm has had a controversial history. In the platform’s early years, the algorithm was simple: the most recently published posts appeared at the top of the feed. That changed in 2016, when Facebook switched the ranking from chronology to relevance, much to the chagrin of users. The company changed the algorithm yet again in 2018, prioritizing posts from friends and family over posts from businesses, brands, and publishers.
Facebook announced that it’s adding yet another tweak to the algorithm. This time, it’s introducing a new ranking signal called click gap. Click gap compares the number of outbound clicks a link receives from News Feed against the number of clicks it receives from the rest of the web. If the clicks from Facebook disproportionately outnumber the clicks from the web, Facebook treats the link as likely spam and demotes the post toward the bottom of the feed.
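Facebook hasn’t published how click gap is actually computed, but the core idea boils down to a ratio check. The sketch below is a minimal, hypothetical illustration of that heuristic; the function names and threshold are assumptions for the example, not Facebook’s real implementation.

```python
def click_gap_ratio(facebook_clicks: int, web_clicks: int) -> float:
    """Hypothetical click-gap ratio: the share of a link's total
    clicks that originate from Facebook rather than the open web."""
    total = facebook_clicks + web_clicks
    if total == 0:
        return 0.0
    return facebook_clicks / total


# Illustrative threshold (not Facebook's): if nearly all of a link's
# traffic comes from News Feed, treat the post as likely spam.
SPAM_RATIO = 0.95


def should_demote(facebook_clicks: int, web_clicks: int) -> bool:
    """Demote the post when Facebook-originated clicks
    disproportionately outnumber clicks from the rest of the web."""
    return click_gap_ratio(facebook_clicks, web_clicks) >= SPAM_RATIO


# A link with 9,800 Facebook clicks but only 200 web clicks would be
# demoted; a link with balanced traffic would not.
print(should_demote(9800, 200))   # True
print(should_demote(5000, 5000))  # False
```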
An algorithm change isn’t the only update to News Feed. Context buttons, too, will appear on images posted to the feed. These buttons are similar to the ones that already appear on articles, providing information about the publisher, the author, and the number of shares.
Curbing Scams and Improving Privacy on Messenger
It’s easy to impersonate someone on Facebook. One only needs to pull an image from that person’s profile and create an account under the unsuspecting victim’s name. It can take months, even years, for the victim to notice the impersonation.
Although impersonation happens to regular users, it’s especially rampant among high-profile accounts owned by public figures, celebrities, and politicians. For this reason, Facebook is expanding verification badges to Messenger.
Facebook is also rolling out a few features that help users better protect their privacy. One is Messenger Settings, which lets users control who can reach their chat list. From this new settings tool, they can prevent strangers, phone contacts, and Instagram followers from messaging them. Another is Forward Indicator, which tells users whether a message they received was forwarded from someone else. The last is the context buttons from News Feed, which will appear when users share articles or images with recipients on Messenger.
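Mechanically, a setting like this amounts to an allow-list check on the sender’s relationship to the recipient. The sketch below models that logic under stated assumptions; the category names and class shapes are hypothetical, not Messenger’s actual API.

```python
from enum import Enum, auto


class Relationship(Enum):
    """Hypothetical sender categories a user can allow or block."""
    FRIEND = auto()
    PHONE_CONTACT = auto()
    INSTAGRAM_FOLLOWER = auto()
    STRANGER = auto()


class MessengerSettings:
    """Per-user privacy settings: which sender categories may
    deliver messages directly to this user's chat list."""

    def __init__(self, allowed: set):
        self.allowed = allowed

    def can_message(self, sender: Relationship) -> bool:
        return sender in self.allowed


# Example: allow friends only; phone contacts, Instagram followers,
# and strangers are all filtered out of the chat list.
settings = MessengerSettings({Relationship.FRIEND})
print(settings.can_message(Relationship.FRIEND))    # True
print(settings.can_message(Relationship.STRANGER))  # False
```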
Better Accessibility for Community Standards
Facebook’s Community Standards outline what users can and cannot do on the platform; information about intellectual property, banned content, and prohibited behavior can all be found there. The document changes, however, because Facebook updates the Community Standards as it discovers new ways bad actors exploit the platform. In the past, Facebook has revised its rules on regulated goods, harassment, and hate speech in response to problematic behavior by errant users.
Because Facebook changes the Community Standards so often, it can be difficult for users to keep track of the new rules. To address this, Facebook has added a section called “Recent Updates,” which lists the sections the company has revised. Facebook will add newly updated sections to Recent Updates each month.
More Accountability for Group Admins
Facebook groups have provided a space for users to connect with strangers on the platform, whether the purpose is to share ideas with like-minded individuals or buy and sell used items. Group categories range from politics to business to humor to religion. Although groups have been a helpful discussion hub for users, they, like many of Facebook’s features, can be exploited.
In the past, Facebook groups have been used for illicit activity. Last year, 60 Marines came under investigation for exchanging revenge porn in a private Facebook group. Because the group was hidden and inaccessible to the public, neither ordinary users nor Facebook was aware of its existence or of the illegal activity occurring within it. Facebook took down the group only after a whistleblower reported the unlawful activities.
To prevent such groups from slipping through again, Facebook is launching Group Quality, a feature the company will use to determine whether or not to take down a group. Group Quality provides an overview of admins’ and members’ activity, such as the number of posts removed or flagged and the number of false-news posts. If Facebook sees that a certain group consistently violates Community Standards, it will reduce the group’s reach on News Feed and in Search.
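Facebook hasn’t disclosed how Group Quality weighs these signals, but a minimal sketch of the idea might aggregate violation counts over time and reduce a group’s reach once a threshold is crossed. Every name, weight, and threshold below is assumed for illustration only.

```python
from dataclasses import dataclass


@dataclass
class GroupActivity:
    """Hypothetical monthly moderation signals for one group."""
    posts_removed: int     # posts taken down for rule violations
    posts_flagged: int     # posts reported by members or reviewers
    false_news_posts: int  # posts rated false by fact-checkers


# Illustrative weights and threshold; Facebook's actual signals
# and values are not public.
WEIGHTS = {"removed": 3.0, "flagged": 1.0, "false_news": 2.0}
DEMOTION_THRESHOLD = 50.0


def violation_score(history: list) -> float:
    """Sum weighted violations across the group's recent months."""
    return sum(
        WEIGHTS["removed"] * m.posts_removed
        + WEIGHTS["flagged"] * m.posts_flagged
        + WEIGHTS["false_news"] * m.false_news_posts
        for m in history
    )


def should_reduce_reach(history: list) -> bool:
    """Consistent violations push the score past the threshold,
    reducing the group's reach in feed and search."""
    return violation_score(history) >= DEMOTION_THRESHOLD


# A group with repeated removals and false-news strikes crosses the
# threshold; a mostly clean group does not.
bad = [GroupActivity(10, 15, 5), GroupActivity(8, 10, 4)]
clean = [GroupActivity(0, 2, 0), GroupActivity(1, 1, 0)]
print(should_reduce_reach(bad))    # True
print(should_reduce_reach(clean))  # False
```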
The past two years have not been easy for Facebook. Just when it has addressed and resolved one issue, another takes its place. Recently, Facebook announced a handful of changes to make its platform a safer place for users, and more updates are likely to come.