Facebook and the Echo Chamber of Secrets

Since the 2016 election kicked off, the term ‘echo chamber’ has been dropped more times than Casey Neistat has dropped his camera over the course of his vlog. The concept describes a closed system in which information, beliefs, and opinions that match personal preferences are amplified and reinforced, while opposing views are marginalized or excluded from consideration altogether. Being exposed mostly to one-sided information that strengthens our existing beliefs, while other perspectives remain underrepresented, can drastically reduce our understanding of those perspectives, and can even undermine our willingness to try to understand the ‘other side’ at all.

An even greater problem arises when false information is introduced into an echo chamber. Instead of critically assessing the facts and questioning the trustworthiness of what is presented, the chamber readily accepts the information as long as it fits its preferred narrative. Later attempts to ‘debunk’ these false facts are consequently ignored, or even end up strengthening previously held (false) beliefs [1]. Because of my own research during grad school, I was already familiar with the concept of an echo chamber. However, I had only ever looked at it from a perspective focused solely on traditional media, such as TV channels and newspapers. But given the growing role social media, and Facebook in particular, plays within today’s media landscape, I thought it was time to think about the platform’s effect on echo chambers.

Today, more than 60% of millennials use Facebook as their primary source of political news [2]. Which news we end up reading through Facebook depends on three things: a) our friends and the content they share, b) which of that content Facebook’s personalization algorithm actually shows us, and c) which of it we decide to click on (which in turn depends on a and b).
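To make the compounding effect of these three filters concrete, here is a minimal, purely illustrative sketch. All names and probabilities are hypothetical and have nothing to do with Facebook’s actual systems; the point is only that three mild filters in a row can leave very little of the ‘other side’ in what we end up reading.

```python
import random

# Hypothetical pool of stories, each tagged with a rough political lean.
stories = [{"id": i, "lean": random.choice(["left", "right"])} for i in range(1000)]

def shared_by_friends(items, friend_lean="left", p_same=0.8):
    """(a) Friends mostly share stories that match their own lean."""
    return [s for s in items
            if random.random() < (p_same if s["lean"] == friend_lean else 1 - p_same)]

def ranked_by_algorithm(items, user_lean="left", p_same=0.9):
    """(b) The personalization step further favours stories matching the user's lean."""
    return [s for s in items
            if random.random() < (p_same if s["lean"] == user_lean else 1 - p_same)]

def clicked_by_user(items, user_lean="left", p_same=0.7):
    """(c) The user is more likely to click on stories that match their lean."""
    return [s for s in items
            if random.random() < (p_same if s["lean"] == user_lean else 1 - p_same)]

seen = clicked_by_user(ranked_by_algorithm(shared_by_friends(stories)))
opposing = sum(s["lean"] == "right" for s in seen) / max(len(seen), 1)
print(f"Share of opposing-view stories actually read: {opposing:.0%}")
```

Running this typically leaves only a few percent of opposing-view stories in the final set, even though the starting pool was split evenly.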

A 2015 study conducted by Facebook found that our own choices limit the range of content we are exposed to more than Facebook’s personalization algorithm does [3]. That is hardly surprising: people generally like having their own opinions mirrored back at them, because it provides a feeling of safety and of being right. Such thinking is embedded from the very beginning of our education, where we are rewarded for being ‘correct’ and penalized for being ‘incorrect’, so we are trained from an early age to conform to pre-existing standards and not stray from what is ‘accepted’ in our social groups.

Several studies have examined confirmation bias, the tendency to seek out information that confirms personal beliefs while more readily rejecting information that opposes them. Facebook’s study also showed that each person surveyed has friends in their network who hold opposing socio-political views, which should mean there is at least a chance of encountering other viewpoints some of the time. However, a recent Pew Research study [4] has shown that the majority of (American) Facebook users simply ignore posts that present opposing beliefs (reinforcing part c), and some users even block or unfriend people for the same reason (reinforcing part a).

These findings imply that it is our own fault if we do not encounter a wider range of news stories, because we are the ones disregarding views that differ from ours. But it is worth highlighting some changes the Facebook News Feed algorithm underwent back in June 2016, which were meant to ensure that users do not miss any important updates from “the friends [they] care about” [5].

On the one hand, this means that how likely users are to encounter a given post depends on how like-minded they are with its author and on their past interactions with that person. On the other hand, it also means that users see posts from Pages, e.g. news sources, far less frequently. The key point is that both of these aspects limit encounters with opposing views even further (reinforcing part b).
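As a rough illustration of how such friend-affinity weighting can narrow what surfaces at the top of a feed, here is a toy scoring function. It is entirely hypothetical: Facebook’s real ranking model is not public, and the weights and penalty below are made up purely to show the direction of the effect.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    is_page: bool           # True for Pages (e.g. news outlets), False for friends
    agreement: float        # 0..1, how closely the post matches the viewer's views
    past_interactions: int  # prior likes/comments between viewer and author

def feed_score(post: Post) -> float:
    """Toy ranking: boosts like-minded friends with interaction history, demotes Pages."""
    affinity = min(post.past_interactions / 10, 1.0)  # capped interaction history
    page_penalty = 0.3 if post.is_page else 1.0       # Pages surface far less often
    return page_penalty * (0.6 * post.agreement + 0.4 * affinity)

posts = [
    Post("like-minded friend", False, agreement=0.9, past_interactions=12),
    Post("friend with opposing views", False, agreement=0.1, past_interactions=1),
    Post("news outlet Page", True, agreement=0.5, past_interactions=0),
]
for p in sorted(posts, key=feed_score, reverse=True):
    print(f"{feed_score(p):.2f}  {p.author}")
```

Under any weighting of this shape, the like-minded friend wins, the opposing friend sinks, and the news Page sinks further still, which is exactly the narrowing described above.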

Although Facebook CEO Mark Zuckerberg repeatedly insists “We are a tech company, not a media company” [6], this ignores how users actually treat the site and does not absolve the company of responsibility for the content users are presented with. In fact, Facebook’s algorithm can be seen as a form of editorial practice, as Will Oremus clarifies:

“The intelligence behind Facebook’s software is fundamentally human. Humans decide what data goes into it, what it can do with that data, and what they want to come out the other end. When the algorithm errs, humans are to blame. When it evolves, it’s because a bunch of humans read a bunch of spreadsheets, held a bunch of meetings, ran a bunch of tests, and decided to make it better.” [7]

Seeing how powerful and influential Facebook has become [8], I feel that it is, at least to some degree, the company’s obligation to pop the existing filter bubble. The present issue is one of accessibility: people are not given an even chance to encounter different views. Facebook could do a number of things to improve the situation. The most important would be to put news stories into context. This could be done rather easily by providing a link to a source that presents the opposite side whenever users click on a news story, at least offering the opportunity to read an opposing stance. In a similar manner, Facebook could automatically display the corresponding headline from a news source on the other side of the political spectrum, or integrate a function that surfaces such a headline on click. The Wall Street Journal has created an online live feed that juxtaposes liberal and conservative news feeds, which lets people view current or recurring newsworthy issues, such as abortion or Donald Trump, from both ideological perspectives.
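A counter-headline feature along these lines would not need much machinery. Here is a minimal sketch of the idea; everything in it is hypothetical: the outlet-to-lean mapping is hand-curated for the example, and the headline index that a real system would need from a news crawler is simply assumed.

```python
# Hand-curated, purely illustrative mapping of outlet domains to a rough lean.
OUTLET_LEAN = {
    "example-left-news.com": "left",
    "example-right-news.com": "right",
}

def opposing_headline(story_url, topic, recent_headlines):
    """Return a headline on the same topic from an outlet on the other side
    of the spectrum, or None if nothing matches.

    `recent_headlines` maps outlet domain -> list of headlines; in a real
    system this index would come from a news crawler (assumed here)."""
    domain = story_url.split("/")[2]
    lean = OUTLET_LEAN.get(domain)
    if lean is None:
        return None
    opposite = "right" if lean == "left" else "left"
    for outlet, outlet_lean in OUTLET_LEAN.items():
        if outlet_lean != opposite:
            continue
        for headline in recent_headlines.get(outlet, []):
            if topic.lower() in headline.lower():
                return f"{outlet}: {headline}"
    return None

print(opposing_headline(
    "https://example-left-news.com/abortion-ruling",
    "abortion",
    {"example-right-news.com": ["New abortion ruling draws praise from conservatives"]},
))
```

The hard part in practice is obviously deciding what counts as the ‘other side’ and matching stories by topic, not the plumbing itself.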

If you want to know more about how the algorithm operates, I recommend this article. Also, you can check what Facebook thinks you are interested in here. Political preferences are found under the lifestyle tab. Surprisingly, for a trans-national German liberal, I was labelled as US Conservative. Further, Politecho gives you an idea of how diverse your network’s perspectives are.

Lastly, I want to emphasize that I am not saying it is Facebook’s fault alone if we only encounter one-sided news coverage. Everyone can decide what kind of news, and which sources, they consume. We are (somewhat) lucky to live in a time where an abundance of news sources is both available and accessible. This means we can seek out accounts from different sides, as well as stories that present multiple perspectives, on our own. Further, we can evaluate the trustworthiness of authors and their sources by considering their backgrounds and possible motives.

The point is that avoiding echo chambers is a collective responsibility: we should do more to pop them, embrace viewpoints that differ from our own, and engage with them civilly.

  1. Quattrociocchi, Walter, Antonio Scala, and Cass R. Sunstein. “Echo Chambers on Facebook”. SSRN (2016). Web. 03 Jan. 2017. [Source]
  2. Mitchell, Amy, Jeffrey Gottfried, and Katerina Eva Matsa. “Facebook Top Source for Political News among Millennials”. Journalism.org. Pew Research Center, 14 Nov. 2016. Web. 03 Jan. 2017. [Source]
  3. Bakshy, Eytan, Solomon Messing, and Lada A. Adamic. “Exposure to ideologically diverse news and opinion on Facebook”. Science 348.6239 (2015): 1130-1132. Web. 03 Jan. 2017. [Source]
  4. Duggan, Maeve and Aaron Smith. “The Political Environment on Social Media”. Pewinternet.org. Pew Research Center, 25 Oct. 2016. Web. 04 Jan. 2017. [Source]
  5. Backstrom, Lars. “News Feed FYI: Helping Make Sure You Don’t Miss Stories from Friends”. Newsroom. Facebook, 29 Jun. 2016. Web. 04 Jan. 2017. [Source]
  6. Segreti, Giulia. “Facebook CEO says group will not become a media company”. Reuters, 29 Aug. 2016. Web. 04 Jan. 2017. [Source]
  7. Oremus, Will. “Who Controls Your Facebook Feed”. Slate, 03 Jan. 2016. Web. 04 Jan. 2017. [Source]
  8. “Number of Facebook users worldwide 2008-2016”. Statista, Jan. 2017. Web. 04 Jan. 2017. [Source]
