Facebook algorithm changes suppressed journalism and meddled with democracy
- Written by Jennifer Grygiel, Assistant Professor of Communications (Social Media) & Magazine, News and Digital Journalism, Syracuse University
Facebook’s News Feed algorithm[1] determines what users see[2] on its platform – from funny memes to comments from friends. The company regularly[3] updates[4] this algorithm, which can dramatically change what information people consume.
As the 2020 election approaches, there is much public concern that what was dubbed “Russian meddling[5]” in the 2016 presidential election could happen again. But what’s not getting enough attention is the role Facebook’s algorithm changes play, intentionally or not[6], in that kind of meddling.
A key counterpoint to the Russian misinformation campaign was factual journalism from reputable sources – which reached many of their readers on Facebook and other social media platforms[7]. As a social media researcher and educator[8], I see evidence that changes to Facebook’s News Feed algorithm suppressed users’ access to credible journalism in the run-up to Trump’s election.
Political operatives know Facebook serves as a gatekeeper of the information diets of more than 200 million Americans[9] and 2 billion users worldwide. Actions and abuse by others on the platform have generated much concern and public discussion, including about how much disinformation and propaganda Americans saw[10] before the election. What has not been talked about enough is the effect that Facebook’s algorithmic shifts have had on access to news and democracy.
Changing the system
In mid-2015[11], Facebook introduced a major algorithm change that pivoted readers away from journalism and news[12] to deliver more updates from their friends and family. The change was couched in friendly language suggesting Facebook was trying to make sure users didn’t miss stories from friends[13]. But social media data shows that one effect of the change was to reduce the number of interactions Facebook users had with credible journalism outlets.
A few months before the 2016 election, an even bigger algorithm change toward friends and family posts[14] took a second toll on publisher traffic. A wide range of news publishers[15] found that their content was significantly less visible[16] to Facebook users.
Examining the numbers
In my research, I looked at Facebook engagement for mainstream news outlets surrounding the 2016 election. My findings support others’ conclusions that Facebook’s algorithm greatly suppressed public engagement with these publishers[17].
Data from CrowdTangle, a social media monitoring company, shows that Facebook traffic dropped noticeably at CNN, ABC, NBC, CBS, Fox News, The New York Times and The Washington Post after the company updated its algorithms to favor friends and family[18] in June 2016.
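To make that comparison concrete, here is a minimal Python sketch of the kind of before-and-after calculation involved: average per-post interactions before and after the June 2016 update, and the relative change between them. The post data, field names and figures below are hypothetical placeholders for illustration, not the CrowdTangle data behind the findings discussed here.

```python
from datetime import date
from statistics import mean

# Hypothetical per-post interaction counts for one outlet, in the style of a
# CrowdTangle export. Dates and numbers are illustrative placeholders only.
posts = [
    {"date": date(2016, 5, 2), "interactions": 14200},
    {"date": date(2016, 5, 16), "interactions": 13800},
    {"date": date(2016, 7, 11), "interactions": 9100},
    {"date": date(2016, 7, 25), "interactions": 8600},
]

# Facebook announced the friends-and-family News Feed update on June 29, 2016.
CHANGE_DATE = date(2016, 6, 29)

before = [p["interactions"] for p in posts if p["date"] < CHANGE_DATE]
after = [p["interactions"] for p in posts if p["date"] >= CHANGE_DATE]

relative_change = (mean(after) - mean(before)) / mean(before)
print(f"Mean interactions before the update: {mean(before):,.0f}")
print(f"Mean interactions after the update:  {mean(after):,.0f}")
print(f"Relative change: {relative_change:+.1%}")
```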
That proves the algorithm worked as designed, but I am concerned that major U.S. publishers were suppressed in this way. Voter interest in the presidential election was higher in 2016 than in the previous two decades[19], and misinformation was rampant. Facebook’s changes meant that key news organizations across the political spectrum had a harder time getting the word out about credible election news and reporting.
Alarm bells
Facebook was aware of concerns about its algorithm even before the election. One of Facebook’s own engineers flagged these potential effects[20] of Facebook’s algorithm changes in July 2015. Three months later, Zuckerberg’s mentor, Roger McNamee, also attempted to alert Zuckerberg and Facebook executives[21] that the platform was being used to manipulate information about the election.
Just after the election, reporter Craig Silverman’s research at BuzzFeed showed that fake election news had outperformed “real news[22].” In late 2018, Facebook’s own company statement revealed issues with how its algorithm rewarded “borderline content[23]” that was sensational and provocative, like much of the hyperpartisan news that trended in advance of the election.
More recent research by Harvard’s Shorenstein Center shows that Facebook traffic continued to decrease significantly for publishers[24] after a further Facebook algorithm change in January 2018.
Video: Prof. Grygiel calls for algorithmic transparency on MSNBC.
Algorithmic transparency
To date, research on how Facebook’s algorithm works has been limited by the lack of access[25] to its proprietary inner workings. It’s not enough to investigate the effects of the changes[26] in Facebook’s News Feed. I believe it’s also important to understand why they happened, and to consider more directly how Facebook’s business decisions affect democracy.
Recent insight into the company’s internal processes suggests that Facebook is beginning to understand its power. In July 2019, Bloomberg News revealed that the company had deployed software on its own platform to look out for posts that portrayed Facebook itself in potentially misleading ways[27], reducing their visibility to safeguard the company’s reputation.
Some international legal scholars have begun to call for laws to protect democracies[28] against the possibility that algorithmic manipulation could deliver electoral gain. There’s no proof that Facebook’s changes had political intentions, but it’s not hard to imagine that the company could tweak its algorithms in the future, if it wanted to.
To guard against that potential, new laws could bar changes to the algorithm in the run-up periods before elections. In the financial industry, for instance, “quiet periods[29]” in advance of major corporate announcements seek to prevent marketing and public-relations efforts from artificially influencing stock prices.
Similar protections against corporate manipulation of algorithms could help ensure that politically active[30], power-seeking Facebook executives[31] – or executives at any other company with significant control over users’ access to information – can’t use their systems to shape public opinion or voting behavior.
References
- ^ News Feed algorithm (www.facebook.com)
- ^ what users see (www.slate.com)
- ^ regularly (www.ktvu.com)
- ^ updates (blog.hootsuite.com)
- ^ Russian meddling (www.reuters.com)
- ^ intentionally or not (theconversation.com)
- ^ reached many of their readers on Facebook and other social media platforms (www.journalism.org)
- ^ social media researcher and educator (newhouse.syr.edu)
- ^ more than 200 million Americans (techcrunch.com)
- ^ disinformation and propaganda Americans saw (www.thedailybeast.com)
- ^ In mid-2015 (fortune.com)
- ^ away from journalism and news (fortune.com)
- ^ didn’t miss stories from friends (www.theguardian.com)
- ^ even bigger algorithm change toward friends and family posts (www.nytimes.com)
- ^ wide range of news publishers (digiday.com)
- ^ significantly less visible (www.theguardian.com)
- ^ algorithm greatly suppressed public engagement with these publishers (www.forbes.com)
- ^ favor friends and family (wallaroomedia.com)
- ^ higher in 2016 than in the previous two decades (www.people-press.org)
- ^ Facebook’s own engineers flagged these potential effects (www.buzzfeednews.com)
- ^ attempted to alert Zuckerberg and Facebook executives (www.usatoday.com)
- ^ real news (www.buzzfeednews.com)
- ^ borderline content (www.facebook.com)
- ^ continued to decrease significantly for publishers (shorensteincenter.org)
- ^ lack of access (doi.org)
- ^ effects of the changes (www.cjr.org)
- ^ look out for posts that portrayed Facebook itself in potentially misleading ways (www.bloomberg.com)
- ^ laws to protect democracies (doi.org)
- ^ quiet periods (www.sec.gov)
- ^ politically active (gizmodo.com)
- ^ power-seeking Facebook executives (theconversation.com)