Mark Zuckerberg appearing live from his personal Facebook account
On the afternoon of September 21, days after news erupted that Russian ads connected to the Internet Research Agency had interfered with the 2016 US presidential election, Mark Zuckerberg went live from his personal Facebook account. Wearing a brown t-shirt and a somber expression, Zuckerberg outlined his plans to prevent future Russian interference. He proposed more advertising transparency, strengthened review processes, and increased investment in security. As noble as Zuckerberg's efforts may be, his seemingly unassailable plan has one problem: it won't work.
Attempts to manipulate American minds happened long before Facebook announced the existence of 3,000 fake Russian ads. In late November, a BuzzFeed reporter discovered nearly 140 fake news sites that had operated in the months leading up to the 2016 election. The sites, which originated in Macedonia, disseminated false clickbait stories to accrue ad dollars. The articles were so effective that they garnered hundreds of thousands of shares, reactions, and comments. The reason they succeeded on Facebook? Irresponsible readers who didn't fact-check and who shared articles they hadn't read, assuming the headlines were honest and accurate.
Readers' negligence isn't the only reason Mark Zuckerberg's plan won't work. Ignorance is also a problem. A study from the Stanford History Education Group, which included 7,804 middle school, high school, and college students from 12 states, found that most of the participants couldn't differentiate between fake news and real news.
Enraged citizens were quick to blame Zuckerberg for the Russian interference. "You are a traitor, sir," commented one user. "Why are you selling ad space to Jew haters?" said another, referencing the anti-Semitic targeting categories that Facebook's ad system mistakenly made available to advertisers.
Zuckerberg may be the obvious scapegoat, but he isn't the only one to blame; we, the American people, are to blame, too. We let emotionally charged stories fuel our urge to share without fact-checking. We argue in comment sections, catapulting fake news to the top of feeds. We hungrily devour articles, factual or not, that suit our biases and regurgitate them to the world. We scroll past fake stories on social media, neglecting to report them.
Zuckerberg may have laid out steps to prevent future Russian interference, but those steps won't work unless we change.