For years Facebook has struggled to combat misinformation. An open and largely unregulated platform frequented by 2.3 billion users, Facebook struggles to keep its platform clean. The fake news disseminated by Russian agencies back in 2016 wasn’t the first instance of misinformation spreading at a large scale, and it certainly won’t be the last.
Recently, Facebook updated its algorithm to curb the spread of content that promotes misleading health claims. The update reduces the reach of organic posts that make exaggerated or sensational health claims, as well as posts that sell products or services based on those bogus claims.
To determine whether a post should be demoted, Facebook answers these two questions:
- Does the post talk about health, and if so, does it exaggerate or mislead? An example of this would be falsely claiming that a miracle herb cures a slew of diseases, such as diabetes, cancer, and Alzheimer’s.
- Does the post attempt to sell a service or product based on the sensational health claim? For example, a post attempting to sell a miracle pill that cures cancer.
According to Facebook’s announcement about the updated ranking, Facebook’s AI will scan posts for phrases commonly used in false health posts. Posts predicted to include sensational health claims will then be placed lower in News Feed.
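Facebook has not published its actual model, but the described approach, matching known phrases and demoting posts that also try to sell something, can be illustrated with a minimal sketch. The phrase lists, scoring scheme, and function names below are invented for illustration only.

```python
# Hypothetical sketch of phrase-based feed demotion. This is NOT
# Facebook's actual (unpublished) system; phrases and scoring are
# invented for illustration.

# Phrases assumed to be common in sensational health posts.
SENSATIONAL_PHRASES = ["miracle cure", "doctors hate", "cures cancer"]
# Phrases assumed to signal an attempt to sell a product or service.
SALES_PHRASES = ["buy now", "order today", "limited supply"]

def demotion_score(post_text: str) -> int:
    """Return a penalty score: one point per matched sensational
    phrase, plus one extra point if the post also appears to be
    selling something based on the claim."""
    text = post_text.lower()
    score = sum(1 for phrase in SENSATIONAL_PHRASES if phrase in text)
    if score and any(phrase in text for phrase in SALES_PHRASES):
        score += 1  # selling based on the bogus claim: demote further
    return score

def rank_feed(posts: list[str]) -> list[str]:
    """Sort posts so that heavily penalized ones appear lower."""
    return sorted(posts, key=demotion_score)
```

Note that the post is never removed, only pushed down the ranking, which mirrors the demotion-rather-than-deletion approach the announcement describes.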
Facebook and Health Misinformation
Facebook’s increased efforts to curb sensational health posts come on the heels of controversy that erupted back in March when vaccine misinformation circulated on the platform. Many of these vaccine-related posts, reports CNN Business, promoted anti-vaccine rhetoric that spread false information.
In response, Facebook took several steps to reduce the distribution of content that misinformed users about vaccination:
- Reducing the rank of groups and pages that spread misinformation about vaccinations in News Feed
- Rejecting ads that include misinformation about vaccination and disabling the ad accounts of repeat offenders
- Hiding content with misinformation from Instagram Explore and hashtag pages
- Removing access to fundraising tools from pages that spread misinformation about vaccines
- Removing violating groups and pages from search results
Facebook has enlisted the help of the World Health Organization and the US Centers for Disease Control and Prevention to identify vaccine hoaxes. Facebook will take action against violating posts when they appear on the platform.
Sensational Health Claims in Facebook’s Ad Policy
Sensational health claims have long been prohibited in Facebook advertising. Only now is Facebook extending those restrictions to organic posts.
Currently, Facebook’s ad policy prohibits content promoting unsafe supplements. The policy states, “Ads must not promote the sale or use of unsafe supplements, as determined by Facebook in its sole discretion.” Examples of these supplements, says the ad policy, include anabolic steroids and human growth hormones.
Facebook’s ad policy also prohibits sensational content: ads with images or copy that aim to shock, disrespect, or scare, or that depict excessively violent content. Of course, misleading or false content is also prohibited, as all ads, landing pages, and business practices “must not contain deceptive, false, or misleading claims, offers, or methods.”
With regard to personal health, Facebook has a slew of prohibitions, including ads that contain before-and-after images, promise unexpected or unlikely results, or attempt to generate negative self-perception in their target audience.
While Facebook takes down ads that contain sensational health claims, the social media company only goes as far as placing sensational organic posts lower in feeds. Because paid ads get more reach, the consequences for violating ads are much stiffer than for organic posts.
Back in 2016, a study found that false health information was more likely to go viral than posts with accurate health claims. Three years later, Facebook still struggles to combat misinformation on its platform, and it seems that the fight will be an ongoing one.