Facebook Struggles with “Filter Bubbles”

Facebook has been receiving a lot of attention for the potential role it may have played in the 2016 U.S. election. Researchers claim that the social media giant likely influenced the outcome through its widespread distribution of fake news stories. The underlying issue in the spread of these stories, however, is the phenomenon of “filter bubbles.” Filter bubbles “are formed by the algorithms social media sites like Facebook use to decide which information to show you, based largely on your own tastes.” Their purpose is to keep users engaged by serving them content they find interesting or relatable. The unforeseen consequence of these bubbles is the potential for bias to skew the worldview of the 62% of Americans who get their news from social media, raising the fear that the content these filter bubbles serve will eventually influence users’ real-life interactions and decisions.
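To make that mechanism concrete, here is a minimal sketch in Python of the kind of preference-based ranking the article describes. The scoring function, the interest sets, and the stories are all hypothetical illustrations for this post, not Facebook’s actual algorithm or data.

```python
# Hypothetical sketch of preference-based feed ranking (not Facebook's
# actual algorithm): each story is scored by how well its topics match
# the user's recorded interests, and the feed shows the highest-scoring
# stories first.

def affinity(story_topics, user_interests):
    """Fraction of a story's topics that match the user's interests."""
    matches = sum(1 for topic in story_topics if topic in user_interests)
    return matches / len(story_topics)

def rank_feed(stories, user_interests):
    """Order stories so the most 'relatable' content appears first."""
    return sorted(stories,
                  key=lambda s: affinity(s["topics"], user_interests),
                  reverse=True)

# Illustrative user and stories.
user_interests = {"tax cuts", "small government"}
stories = [
    {"title": "The case for lower taxes", "topics": ["tax cuts"]},
    {"title": "The case for a stronger safety net", "topics": ["welfare"]},
]
for story in rank_feed(stories, user_interests):
    print(story["title"])
```

Even in this toy version, stories with zero affinity sink to the bottom of the feed on every refresh, which is exactly the narrowing effect the article calls a bubble.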

The author of the article suggests that one solution would be for Facebook to let users “flip their feeds,” essentially showing them the opposite of the viewpoints their feed normally displays. Other proposed solutions include surfacing highly rated news stories on everyone’s feed, regardless of their leanings or opinions. The author notes that another issue arises from the fact that all sources are presented identically: “whether you’re looking at The New York Times or your neighbor’s blog,” it all looks the same. Facebook needs to create a distinction between sources so that opinion and false information cannot be presented in the same way as legitimate news. The proposed solutions are interesting but might be difficult to implement if they rely too heavily on individuals willingly agreeing to consider an opposing viewpoint. In the toy model above, “flipping” could be as simple as reversing the sort, as sketched below.
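This is only a sketch of the author’s proposal against the hypothetical ranking model above, not a description of any real Facebook feature:

```python
def flip_feed(stories, user_interests):
    """'Flip the feed': rank the least-relatable stories first, the
    reverse of rank_feed above (a sketch of the article's proposal)."""
    return sorted(stories,
                  key=lambda s: affinity(s["topics"], user_interests))
```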

These bubbles operate similarly to the networks we’ve been learning about in class. The information you receive is determined by whom you friend or unfriend and what you like, so by removing people you disagree with, you “add more boundaries to your reality.” This relates back to the idea of weak ties as a good source of novel information. You receive your information from Facebook friends and liked pages, and by removing the weak ties you disagree with (unfriending people on Facebook), you limit yourself to a biased stream of information. The way we share news on Facebook can also be viewed through the lens of cascading behavior. When only one or two of your friends share a news story, you are unlikely to share it yourself; when a large number of friends share it, you become much more likely to share it too. This can be attributed to the threshold rule, which states that if a certain fraction of your neighbors adopt a behavior, then you will too. In the case of Facebook, the friends on your news feed are your “neighbors” and sharing a news story is the “behavior.” Once a cascade starts and the entire network begins sharing a fake news story, it is easy to see where the problems arise.
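The threshold rule is easy to simulate. Below is a minimal sketch assuming a small undirected friendship graph and a uniform sharing threshold of one half; both the network and the threshold are illustrative choices, not data from the article.

```python
# Minimal threshold-rule cascade: a user shares the story once the
# fraction of their friends who have already shared it reaches their
# threshold. Repeat until no one new shares.

def run_cascade(friends, initial_sharers, threshold=0.5):
    """friends: dict mapping each user to a list of their friends."""
    shared = set(initial_sharers)
    changed = True
    while changed:
        changed = False
        for user, neighbors in friends.items():
            if user in shared or not neighbors:
                continue
            sharing_frac = sum(1 for n in neighbors if n in shared) / len(neighbors)
            if sharing_frac >= threshold:  # the threshold rule
                shared.add(user)
                changed = True
    return shared

# Illustrative network: a fake story seeded with two initial sharers.
friends = {
    "alice": ["bob", "carol"],
    "bob":   ["alice", "carol", "dave"],
    "carol": ["alice", "bob", "dave"],
    "dave":  ["bob", "carol", "eve"],
    "eve":   ["dave"],
}
print(run_cascade(friends, {"alice", "bob"}))
# With a 0.5 threshold, the story cascades through the entire network.
```

Two seed users are enough here because each remaining user’s sharing neighbors cross the one-half threshold in turn, which is the same domino effect that lets a fake story sweep a real news feed.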

https://www.newscientist.com/article/2113246-how-can-facebook-and-its-users-burst-the-filter-bubble/
