A Digital Sounding Board: The Internet and Filter Bubbles

By Clare Nolan

Most of us have some sort of daily routine with the Internet. For example, every morning, I get up, take a shower, and then settle down in front of my computer for fifteen to twenty minutes of Internet browsing before I get ready for the day. I check my Facebook, I read the webcomics I follow, I look at my e-mail, I peruse some blogs, and I scan through viral images. Instead of morning coffee, I start my day with a blast of information. But does that blast of information contain a wide range of material from across the Internet or is it made up of content that’s been tailored just for me?

The fact is that the Internet we see is not pure, unfiltered information, but information that has passed through a variety of filters designed to ensure that we, as users, receive the kind of information we most want to see. These filters are shaped by advertisers, social media companies, and search engine developers, and even include self-imposed filters we may not be conscious of. Together, all of these filters form what author Eli Pariser refers to as “the filter bubble.” In his 2011 TED Talk, Pariser defines the filter bubble as:

Your own personal, unique universe of information that you live in online. And what’s in your filter bubble depends on who you are, and it depends on what you do. But the thing is that you don’t decide what gets in. And more importantly, you don’t actually see what gets edited out. 1

Sites like Google, Facebook, and even various news sites are in the business of making sure that the content that appears on your screen is exactly the content you want to see. The reason is simple: the more they display content you want, the more time you spend on their websites, and the more they profit from the ad revenue that comes with each page you click. They have no interest in providing you with a diverse array of information, only the information that will keep you on their site. The result is that many users find limited information, or information that merely supports what they already believe and want to hear, while under the impression that they are receiving unfiltered information. Robert W. McChesney writes that filter bubbles “keep us in a world that constantly reinforces our known interests and reduces empathy, creativity, and critical thought.”2

Filter bubbles are constructed in a variety of ways, and are designed to keep users reliant on a certain service. Pariser’s initial example is Facebook, where your News Feed is tailored based on the people you interact with on the site. Pariser tells the story of how he started to notice that his friends who posted links to politically conservative information started to vanish from his News Feed, while the friends who posted links to politically liberal information remained.3 This is because Facebook employs an algorithm that tracks how often you interact with certain people (clicking links, liking posts, commenting, etc.) and prioritizes your News Feed based on those interactions. For Pariser, a self-described liberal, this meant that his liberal friends stayed on his News Feed because he interacted with their posts more often than he did with those of his conservative friends. He was exposed less and less to opposing viewpoints, meaning he had fewer chances for debate and fewer opportunities to learn something from an unfamiliar source. This does not mean that everything people say online has value, or that all of the conservative information would even have been worth Pariser’s time. But the ideal of an Open Internet where all information can be accessed equally is not the Internet we have if sites and advertisers put more and more filters on our content.
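Facebook’s actual ranking code is proprietary, so the details here are purely illustrative, but the interaction-weighting idea Pariser describes can be sketched in a few lines. The action weights and friend names below are my own invented examples, not anything Facebook has published:

```python
from collections import defaultdict

# Hypothetical weights -- the real algorithm's weights are not public.
WEIGHTS = {"click": 1.0, "like": 2.0, "comment": 3.0}

def affinity_scores(interactions):
    """Sum the weighted interactions the user has had with each friend."""
    scores = defaultdict(float)
    for friend, action in interactions:
        scores[friend] += WEIGHTS.get(action, 0.0)
    return scores

def rank_feed(posts, interactions):
    """Order posts by affinity with each author. Authors the user rarely
    engages with sink to the bottom -- and, in a real feed, off the page."""
    scores = affinity_scores(interactions)
    return sorted(posts, key=lambda p: scores[p["author"]], reverse=True)

# A user who mostly engages with like-minded friends:
interactions = [("liberal_friend", "like"), ("liberal_friend", "comment"),
                ("liberal_friend", "click"), ("conservative_friend", "click")]
posts = [{"author": "conservative_friend", "text": "op-ed A"},
         {"author": "liberal_friend", "text": "op-ed B"}]
feed = rank_feed(posts, interactions)
print([p["author"] for p in feed])  # liberal_friend ranks first
```

The point of the sketch is that nothing here ever asks what the conservative friend posted; one lopsided history of clicks is enough to bury it.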

According to Facebook, the News Feed is designed this way because the large number of Facebook friends most users have would otherwise make the News Feed unwieldy.4 The downside is that users are exposed to less information that might challenge their way of thinking, and to more information that supports what they already believe. Similar algorithms and personalization techniques are used on Google, and on news sites like the Huffington Post, Yahoo News, the Washington Post, and the New York Times.5 With all of these filters, how can we consider opposing viewpoints? How can we engage in discussion? How can we learn anything? Pariser argues that because filter algorithms respond to what a user clicks on, users eventually get only content that satisfies their immediate wants and whims, rather than content that pushes them to think further. He says:

The best editing gives us a bit of both [thoughtful content and fun content]. It gives us a little bit of Justin Bieber and a little bit of Afghanistan. It gives us some information vegetables; it gives us some information dessert. And the challenge with these kinds of algorithmic filters, these personalized filters, is that, because they’re mainly looking at what you click on first, it can throw off that balance. And instead of a balanced information diet, you can end up surrounded by information junk food.6

The problem is that content providers, search engines, and advertisers don’t necessarily see a reason to give users an Internet that serves them both ‘vegetables’ and ‘dessert’. The system currently in place makes money, and as long as these companies are profiting, they have little investment in what content their users are consuming.
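The imbalance Pariser describes is a feedback loop: the filter shows more of whatever gets clicked, so a modest early preference compounds. This toy simulation is my own illustration, not any real site’s code; the categories, click rates, and learning increment are all invented:

```python
import random

random.seed(0)  # fixed seed so the run is reproducible
shown_share = {"news": 0.5, "fun": 0.5}  # what the filter serves, initially balanced
click_rate = {"news": 0.3, "fun": 0.6}   # the user clicks "fun" a bit more often

for _ in range(1000):
    # The filter picks what to show in proportion to past engagement...
    category = random.choices(list(shown_share),
                              weights=list(shown_share.values()))[0]
    # ...and every click teaches it to show more of that category.
    if random.random() < click_rate[category]:
        shown_share[category] += 0.01

total = sum(shown_share.values())
print({k: round(v / total, 2) for k, v in shown_share.items()})
```

After a thousand rounds the ‘fun’ share dominates the feed, even though the user never stopped clicking on news entirely; the algorithm only ever saw the clicks, never the intent.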

The more that individuals are exposed to views and information that validate and reinforce their current world view, the harder it becomes to converse with others about those views, especially in a digital format. Filters help convince users that their opinions are more valid than the opinions of others, and people start to create online communities where little debate is welcome and users mostly share the same opinions. People who don’t share those opinions might be engaged in debate, but a debate that happens face-to-face is very different from the kind that often happens online. So many Internet debates boil down to people slinging insults, shutting other people down, or leaning on the caps lock key to make their point. Online discourse is so commonly hostile that a search for “arguing on the internet” returns a slew of images mocking the very idea of online debate.7 If online users were exposed to a wider variety of content that challenged their world views, would the nature of online debate change?

[Image: Webcomic artist Cameron Davis’s interpretation of online debates. From my own experiences, this certainly doesn’t apply only to men.]

The good news is that while content providers and advertisers may not be interested in popping the filter bubble, there are ways Internet users can lessen its effects on their online experience. Pariser’s website, The Filter Bubble, lists ten ways to reduce the effect of the filters. These techniques include deleting cookies and browser histories, setting stricter privacy settings, using browsers and sites that let users access the Internet without revealing their IP addresses, and depersonalizing browsers.8 It also helps simply to be aware of the filter bubble. We may be stuck with filters, but if we know they are there and what they are doing to our online experience, we can compensate for those effects and seek out information we might not otherwise find. The Internet may be a fantastic source of information, but if we do not use it properly, what is the point of having that information source in the first place?

  1. Pariser, E. (2011, February). Eli Pariser: Beware online “filter bubbles” [Video file]. Retrieved from http://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles/
  2. McChesney, R. W. (2013). Digital disconnect: How capitalism is turning the internet against democracy. New York: The New Press.
  3. Pariser, E. (2011, February). Eli Pariser: Beware online “filter bubbles” [Video file]. Retrieved from http://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles/
  4. Hicks, M. (2010, August 6). Facebook tips: What’s the difference between top news and most recent? (Web log post). Retrieved from https://www.facebook.com/notes/facebook/facebook-tips-whats-the-difference-between-top-news-and-most-recent/414305122130
  5. Pariser, E. (2011, February). Eli Pariser: Beware online “filter bubbles” [Video file]. Retrieved from http://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles/
  6. Ibid.
  7. Though to be fair, this is possibly affected by Google’s filters on my search.
  8. Pariser, E. (2011) Ten ways to pop your filter bubble. Retrieved from http://www.thefilterbubble.com/10-things-you-can-do
This work is licensed under a Creative Commons Attribution-NonCommercial 3.0 Unported License.
