The author of The Filter Bubble on how fake news is eroding trust in journalism

https://www.theverge.com/2016/11/16/13653026/filter-bubble-facebook-election-eli-pariser-interview

Eli Pariser wrote the book on the “filter bubble.” He described how a number of factors contribute to a lack of ideological diversity in our media consumption. In this world, we are constantly being targeted based on what other sources know about us. As a simple example, think back to the last blog you visited. Odds are, in the sidebar, there’s a “Similar Articles” section. If that’s not there, there’s probably a “Trending Now” section designed to retain your attention and ultimately get you to focus on the advertisements off to the side. From a website owner’s perspective, it just makes sense: more page views mean more engagement with the content on social media and more ad impressions. I even have a related articles section on my personal website that I made in early high school. Advertising takes this to another level. There’s big money in tracking users and targeting them with advertisements. Just yesterday I spoke with an alumnus currently working at a large technology company. He told me that they are using Wi-Fi network data collected from users of their Android app to build a map of access points all across the world. The end goal is to use this data to locate users who decided not to share their location with the app. They were trying to find the conversion rate of users who view events in the app and then attend them in person.

This article specifically focuses on Facebook because of the ongoing debate about the influence of fake news and filter bubbles in politics. This is very relevant to Networks because the article specifically mentions PageRank as “a pretty good system for assigning webpages authority without having to pick winners and losers.” However, it also discusses additional complexities that will soon need to be addressed. Algorithms must now find a way to determine the truth behind sources, because material can be popular, and thus gain a lot of authority, without being accurate. Companies like Google and Facebook have rapidly expanding influence as the gatekeepers to content. It is now their responsibility to ensure that their services are not manipulated into serving users content designed to advance a particular agenda rather than to provide valid information or entertainment. One of the interesting challenges these companies face is finding ways to decipher the truth algorithmically, without human intervention and political bias. Just as authority online used to come from manual directories like DMOZ before algorithmic search took over, today we rely on manual fact checkers. The next big problem for network theory will be making a similar shift in identifying content quality.
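To make the PageRank point concrete, here is a minimal sketch of the algorithm (power iteration with a damping factor). The graph, damping value, and page names below are illustrative assumptions, not taken from the article; the sketch only shows how authority flows along links regardless of whether the linked content is accurate.

```python
# Minimal PageRank sketch: authority is assigned purely from link structure.
# Graph, damping factor, and tolerance are illustrative choices.

def pagerank(links, damping=0.85, tol=1e-8, max_iter=100):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}

    for _ in range(max_iter):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:
                # Dangling page: spread its rank evenly over all pages.
                for target in pages:
                    new_rank[target] += damping * rank[page] / n
        if sum(abs(new_rank[p] - rank[p]) for p in pages) < tol:
            return new_rank
        rank = new_rank
    return rank

# Hypothetical example: a low-quality page ("hoax") still accumulates
# authority if enough pages link to it -- the algorithm only sees links.
graph = {
    "news_site": ["hoax", "blog"],
    "blog": ["hoax"],
    "forum": ["hoax", "news_site"],
    "hoax": [],
}
print(pagerank(graph))
```

In this toy graph the hoax page ends up with the highest rank simply because it receives the most links, which is exactly the gap between popularity and accuracy that the article says future algorithms will have to close.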
