
Examining The Ways That Facebook Helped Elect Donald Trump


In the wake of Tuesday's election and Donald Trump's stunning upset victory, there's been a whirlwind of discussion about the media's role in the outcome -- and in particular, the ways in which Facebook fueled Trump's unprecedented rise while hiding his momentum from Hillary Clinton supporters.

Of course, no single thing elected Donald Trump. But considering that Facebook is now the go-to news source for 44 percent of American adults, its role in the election is worth examining. Here are a few links to interesting analyses of how Facebook played a part in Trump's victory; I expect we'll see more in the weeks to come.

http://nymag.com/selectall/2016/11/donald-trump-won-because-of-facebook.html

Former Gawker editor Max Read writes an attention-grabbing headline and very nearly backs it up here, first positing that Facebook has performed a "wholesale acquisition of the traditional functions of news media" (a notion to which anyone watching the Google Analytics dashboard at a news organization can attest), and then arguing that despite its great power, Facebook has failed at assuming great responsibility:

The most obvious way in which Facebook enabled a Trump victory has been its inability (or refusal) to address the problem of hoax or fake news. Fake news is not a problem unique to Facebook, but Facebook’s enormous audience, and the mechanisms of distribution on which the site relies — i.e., the emotionally charged activity of sharing, and the show-me-more-like-this feedback loop of the news feed algorithm — makes it the only site to support a genuinely lucrative market in which shady publishers arbitrage traffic by enticing people off of Facebook and onto ad-festooned websites, using stories that are alternately made up, incorrect, exaggerated beyond all relationship to truth, or all three.

As an example of fake news, Read references this article on Macedonian teenagers creating blatantly false content to dupe American Facebook users -- Trump supporters in particular -- into clicking outbound links that earn the teenagers Google AdSense revenue with each click.


But in its quest to swallow every part of the internet, Facebook incentivizes media outlets to publish their content -- video, images, text -- directly to Facebook. How's that going?

http://www.nytimes.com/2016/08/28/magazine/inside-facebooks-totally-insane-unintentionally-gigantic-hyperpartisan-political-media-machine.html

Not so well. We all see blatantly biased or false stories shared on Facebook, and here, the New York Times’ John Herrman looks at the companies behind many of them. "They have names like Occupy Democrats; The Angry Patriot; US Chronicle; Addicting Info; RightAlerts; Being Liberal; Opposing Views; Fed-Up Americans; American News; and hundreds more. Some of these pages have millions of followers; many have hundreds of thousands," he writes.

And unlike traditional media organizations, which have spent years trying to figure out how to lure readers out of the Facebook ecosystem and onto their sites, these new publishers are happy to live inside the world that Facebook has created. Their pages are accommodated but not actively courted by the company and are not a major part of its public messaging about media. But they are, perhaps, the purest expression of Facebook’s design and of the incentives coded into its algorithm — a system that has already reshaped the web and has now inherited, for better or for worse, a great deal of America’s political discourse.

Importantly, Facebook isn't combating this glut of fake news. You probably remember how, in response to conservative outcry over alleged bias in its Trending Topics section, Facebook eliminated the section's human editors altogether.

http://www.latimes.com/business/la-fi-election-media-20161109-story.html

In the Los Angeles Times, David Pierson makes the case that, given Facebook's huge influence, it needs to start taking responsibility for what it publishes. Of course, according to Facebook, its users are doing the publishing -- a framing that shields the company from libel and defamation claims.

“If their goal is to simply retain user engagement by reaffirming everything users already believe without challenging them, then there are real consequences. They need to own up to that,” said Gabriel Kahn, co-director of the Media, Economics and Entrepreneurship program at USC's Annenberg School for Communication and Journalism.

Kahn said the proliferation of fake news reminded him of what the late Sen. Daniel Patrick Moynihan often liked to say: “Everyone is entitled to his own opinion, but not to his own facts.”

But fake stories aren't the only problem. There's also the question of what Facebook chooses to show you.

https://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles?language=en

The internet was once idealized as the free exchange of different, sometimes uncomfortable ideas; Facebook has made it a constant soothing reflection of one's own beliefs and interests. This idea is not new -- here's a TED Talk from Eli Pariser about the online "filter bubble" from five whole years ago. It's very odd to watch this week.

Essentially, Facebook's algorithm takes the posts from our already-curated circle of friends, and then filters them to show us what it thinks we want to see. (The Wall Street Journal created an eye-opening tool that presents "red" and "blue" Facebook feeds side by side; try it out.) One of the narratives that arose quickly after Tuesday night was the so-called elite urban left's inability to engage with the disenfranchised rural right. On an individual level, this means Clinton supporters felt safe in their Facebook bubble, unaware of the size and passion of Trump supporters.
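To make that feedback loop concrete, here's a toy sketch in Python. To be clear, this is not Facebook's actual code or anything close to it -- the fields, the affinity scoring and the weights below are invented purely to illustrate the "show-me-more-like-this" dynamic:

    # Toy sketch of a "show-me-more-like-this" feed ranker.
    # NOT Facebook's algorithm: the Post/User fields, the affinity
    # scoring, and the weights are all invented for illustration.
    from dataclasses import dataclass, field

    @dataclass
    class Post:
        author: str
        topic: str  # e.g. "politics-left", "politics-right"

    @dataclass
    class User:
        name: str
        affinity: dict = field(default_factory=dict)  # topic -> engagement score

    def score(user: User, post: Post) -> float:
        # Topics the user already engages with rank higher.
        return user.affinity.get(post.topic, 0.0)

    def rank_feed(user: User, candidates: list) -> list:
        # Highest-affinity posts float to the top of the feed.
        return sorted(candidates, key=lambda p: score(user, p), reverse=True)

    def record_engagement(user: User, post: Post) -> None:
        # The feedback loop: each like or share nudges the feed
        # toward more posts on the same topic.
        user.affinity[post.topic] = user.affinity.get(post.topic, 0.0) + 1.0

    # Demo: three likes on one side, and the other side sinks.
    alice = User("alice")
    feed = [Post("pundit_r", "politics-right"), Post("pundit_l", "politics-left")]
    for _ in range(3):
        record_engagement(alice, feed[1])
    print([p.topic for p in rank_feed(alice, feed)])
    # -> ['politics-left', 'politics-right']

After just a handful of likes, posts from the other side of the aisle sink to the bottom of the feed: the filter bubble in miniature.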

So when your Democratic friends take pundits to task for not accurately predicting Trump's chances of winning -- if only we'd known, we might have fought harder, they think -- they also have to rethink Facebook's role in nurturing their feeling of security. According to their timelines, there was no threat. Trump was going to lose. Everybody on their timelines said so!

That might have been good news for them. Meanwhile, on the other side of the divide, buoyed by the same feel-good algorithm, Trump supporters consumed their own news -- in varying degrees of truthiness -- on Facebook.


UPDATE: Mark Zuckerberg has responded. NPR's Aarti Shahani reports:

http://www.npr.org/sections/alltechconsidered/2016/11/11/501743684/zuckerberg-denies-fake-news-on-facebook-had-impact-on-the-election

And the New York Times reports on the concern within Facebook's upper ranks about the company's influence on the election, and the all-hands meeting held for staff following Trump's victory:


http://www.nytimes.com/2016/11/14/technology/facebook-is-said-to-question-its-influence-in-election.html
