Fake News and Information Cascades
Article: http://www.vox.com/new-money/2016/11/16/13637310/facebook-fake-news-explained
Recently, Facebook has come under fire for its fake news problem and its possible effect on the recent election. Specifically, many have been quick to blame Facebook for not doing a good enough job of filtering fake news, which many of its users could take as fact. Although I think this is an interesting problem with a nontrivial solution, it is also worth understanding more about the people who take fake news as fact. Analyzing this phenomenon through the lens of information cascades might shed light on user behavior.
Although I believe confirmation bias likely plays a larger role in how people come to take fake news as fact, information cascades can also contribute to the spread of false information. First, to set up the problem: we can view not sharing an article as rejecting it and sharing an article as accepting it. In this setup, people are more likely to share an article if their friends have already shared or liked it, and the more people who have shared or liked the post, the greater the chance that the article continues to spread. Furthermore, because there is no dislike button on Facebook, most feedback left on an article comes in the form of positive likes and shares, so the likelihood of an information cascade is already elevated.

This raises an interesting question: because leaving negative comments has a higher barrier to entry, is a positive-leaning, ‘like’-based feedback system more susceptible to information cascades? I would suspect that it is. Given the success that news aggregators with both positive and negative feedback, like Reddit, have had in filtering out fake news, it is possible that introducing a balancing negative feedback system would help Facebook’s fake news problem. However, this approach still has flaws due to Facebook’s echo-chamber-like news feed.
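To make this intuition concrete, here is a minimal simulation sketch in Python. The model, the parameter names (`p_base`, `boost`, `p_dislike`), and the numbers are my own illustrative assumptions, not anything from the article: each user’s chance of sharing rises with the net positive feedback they observe, so a likes-only feed can tip into a cascade, while a feed with a negative signal tends to dampen one.

```python
import random

def simulate_feed(n_users=1000, p_base=0.05, boost=0.02,
                  allow_dislikes=False, p_dislike=0.5, seed=0):
    """Toy cascade model (my own simplification): each user shares a
    fake article with probability p_base plus a boost proportional to
    the net feedback (likes minus dislikes) seen so far."""
    rng = random.Random(seed)
    likes, dislikes, shares = 0, 0, 0
    for _ in range(n_users):
        net = likes - dislikes
        # Share probability grows with net positive feedback, clamped to [0, 1].
        p_share = min(1.0, max(0.0, p_base + boost * net))
        if rng.random() < p_share:
            shares += 1
            likes += 1        # sharing doubles as a positive signal to later users
        elif allow_dislikes and rng.random() < p_dislike:
            dislikes += 1     # negative feedback, only if the UI permits it
    return shares

# Positive-only feedback (likes and shares, no dislike button):
print("likes only:   ", simulate_feed())
# With a balancing negative signal:
print("with dislikes:", simulate_feed(allow_dislikes=True))
```

In this toy model, the likes-only feed almost always tips into a full cascade once a few early shares accumulate, while even a modest chance of a dislike keeps the net signal, and with it the share probability, pinned near its base rate. That asymmetry is exactly what the question above is getting at.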