OCT. 19, 2020
In 2016, Facebook came under fire for allowing fake, politically divisive Russian ads to flood the platform. The suspected goal of those ads was to disrupt and influence the election.
For the 2020 election, Facebook is attempting to right those wrongs. In addition to committing to share only clear, authoritative information related to the election, they also want to encourage voting.
Here are the changes that Facebook is making to prevent election interference this year.
Facebook’s Promise to Users
According to its website, Facebook’s goals are to “secure the integrity of the US elections by encouraging voting, connecting people to authoritative information, and reducing the risks of post-election confusion.” To do that, they have a four-part plan:
- New political ads will not be accepted during the week leading up to the election, since that final week leaves too little time for fact-checkers or journalists to contest new claims. Ads that are already running can continue, though, and their targeting can still be adjusted.
- Any posts that say you’ll contract COVID-19 if you vote (presumably in public) will be removed. Furthermore, on posts that use COVID-19 scare tactics to discourage people from voting, Facebook will add links to authoritative sources.
- On posts claiming that voting methods are illegitimate or fraudulent, Facebook will attach an informational label. The same goes for posts that say the outcome of the election may be illegitimate due to voting methods.
- Should a campaign or candidate declare victory before the official results are in, Facebook will add a label that directs people to the National Election Pool and Reuters, where they can see official results. This is especially important for the 2020 election because the COVID-19 pandemic means a lot of voters will be mailing in their ballots, and the final result may come later than election night.
Facebook handles harmful content and misinformation differently in ads than in posts. Ads can be rejected before they ever appear, while posts can only be labeled or removed after they're published, since Facebook has less control over posts than it does over ads.
How Facebook Is Helping Voters Register
According to a Facebook post from CEO Mark Zuckerberg, “Facebook is already running the largest voting information campaign in American history — with a goal of helping 4 million people to register and then vote. In just three days, we already drove almost 24 million clicks to voter registration websites.” To further encourage voting, Zuckerberg said that Facebook will “fight misinformation” and “connect people with authoritative information.” Now, when you log in to Facebook or Instagram, you’ll see information toward the top of the page that directs you to voting information.
According to Zuckerberg, “We will put authoritative information from our Voting Information Center at the top of Facebook and Instagram almost every day until the election. This will include video tutorials on how to vote by mail, and information on deadlines for registering and voting in your state.” Facebook is also removing misinformation that would prevent someone from properly voting. For example, it will remove claims that you can mail in a ballot after Election Day, which isn’t true.
Stopping the Spread of Misinformation
That Facebook post from Zuckerberg also described efforts to limit the spread of harmful content and misinformation. Forwarding on Messenger is now limited: a user can still share election-related information, but there’s a cap on the number of chats a message can be forwarded to at one time. (WhatsApp has already used this strategy to what Facebook considers good effect.) The limit should also curb how widely any one user can push their opinions and beliefs, legitimate or not, onto others.
What About Those Russian Ads from 2016, Though?
Zuckerberg didn’t shy away from referring to that situation in his post, saying, “…four years ago we encountered a new threat: coordinated online efforts by foreign governments and individuals to interfere in our elections. This threat hasn’t gone away.” He went on to say that, within just one week, Facebook took down 2 pages and 13 accounts that had intended to “mislead Americans and amplify division.”
Aside from that, though, the rest of the statement was pretty vague, offering just a blanket promise that Facebook’s security systems are more sophisticated than before. Since election interference was such a major issue in 2016, and remains a huge concern now, it’s not altogether encouraging that little else was said about it.
It seems that those informational labels Facebook will be attaching to potentially harmful content are intended to negate the influence those posts can have on viewers. And since COVID-19 is such a hot-button issue right now, it stands to reason that Facebook expects a lot of misinformation to surround the virus, and their new efforts will try to thwart that.
Getting Paid to Deactivate Your Account
One last thing: There have been a lot of reports about Facebook paying users to deactivate (not permanently delete) their accounts before the election. Some users saw a survey asking how much they would want to be paid to deactivate their Facebook or Instagram account for a few weeks leading up to Election Day. Some were shocked, while others felt this sort of offer was par for the course in academic research experiments. Compensation options ranged from just $10 up to $120, and users chose both their payment amount and their deactivation time frame.
So Facebook is now going to pay people to deactivate their IG and FB accounts before Election Day. It’s part of the research experiment announced Monday but WOW. This notice went out this week. pic.twitter.com/tV7DAw8F5I
— Elizabeth Dwoskin (@lizzadwoskin) September 3, 2020
The survey and research participation are part of Facebook’s broader study of how social media impacts elections. And it’s an interesting offer for anyone happy to steer clear of the abundance of political content on the platform right now.
By Lindsay Pietroluongo