Bias at Facebook Didn’t Just Start with the Election
http://www.nytimes.com/2016/05/19/opinion/the-real-bias-built-in-at-facebook.html
Over the past week, numerous news articles and blog posts have focused on the election and the fake stories propagated on Facebook. In this blog post, I will examine information cascades, how Facebook’s algorithms were viewed even a few months before the election, and how accurate those early assessments of Facebook’s influence turned out to be. In “The Real Bias Built In at Facebook,” published in May of this year, the opinion writer argues that algorithms are inherently biased, or at the very least, not neutral. For instance, if a hiring algorithm ranks one candidate’s résumé above another’s, it is difficult to say objectively what is right and what is wrong.
Additionally, with machine learning, we can have computers determine their own course of action. As the writer says, “while we now know how to make machines learn, we don’t really know what exact knowledge they have gained. If we did, we wouldn’t need them to learn things themselves: We’d just program the method directly.” Thus, computers can have biases that we might not even know about. “Like”-ing or commenting on a post increases that post’s visibility on Facebook; this disadvantages important topics that are difficult to “like” or comment on. What we see on our newsfeeds is shaped by “algorithms, which are shaped by what the companies want from us, and there is nothing neutral about that.”
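The engagement-driven ranking described above can be illustrated with a toy model. The weights and posts below are purely hypothetical and are not Facebook’s actual News Feed algorithm; the point is only that ranking by raw engagement systematically buries topics people hesitate to “like”:

```python
# Toy sketch of engagement-based feed ranking.
# Weights and posts are hypothetical, not Facebook's real algorithm.
posts = [
    {"title": "Cute puppy photo", "likes": 900, "comments": 40},
    {"title": "Famine relief report", "likes": 30, "comments": 8},
    {"title": "Friend's vacation album", "likes": 200, "comments": 25},
]

def engagement_score(post, like_weight=1.0, comment_weight=2.0):
    """Score a post by raw engagement; topics that are awkward to
    'like' or comment on sink regardless of their importance."""
    return like_weight * post["likes"] + comment_weight * post["comments"]

feed = sorted(posts, key=engagement_score, reverse=True)
for post in feed:
    print(post["title"], engagement_score(post))
```

Under this scoring, the famine report lands at the bottom of the feed even though a human editor might rank it first, which is exactly the kind of non-neutral choice the article describes.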
These ideas closely align with what we have learned in class. Our network of Facebook friends likely mirrors our real-world networks of friends and colleagues, and what we see on our newsfeeds is largely shaped by what others in that network have liked, shared, and commented on. Aligning your beliefs with others, based on what you see on Facebook, can have direct-benefit effects: you can share similar articles with them and receive likes as affirmation, since your beliefs align with theirs, and your social interactions could improve as you find topics to agree on together. Additionally, your associations with others can have informational effects if you believe that your friends are better informed about current events and therefore trust and adopt their opinions.
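These informational effects are what drive information cascades. A simplified version of the sequential-decision model from class can be sketched as follows (the rule here is a common simplification: follow your private signal unless earlier public choices lead by two or more):

```python
# Minimal sketch of an information cascade in the sequential-decision
# model; the tie-breaking rule is a simplification for illustration.
def cascade(signals):
    """Each person has a private signal (True = 'accept') and sees all
    earlier public choices. They follow the majority of prior choices
    once it leads by 2 or more; otherwise their own signal decides."""
    choices = []
    for signal in signals:
        accepts = sum(choices)
        rejects = len(choices) - accepts
        if accepts > rejects + 1:
            choices.append(True)    # cascade: own signal is ignored
        elif rejects > accepts + 1:
            choices.append(False)   # cascade in the other direction
        else:
            choices.append(signal)  # private signal still decisive
    return choices

# Two early 'accept' signals lock in a cascade even though most
# later private signals disagree.
print(cascade([True, True, False, False, False, False]))
```

In this run everyone after the second person accepts, despite four of the six private signals saying otherwise. On Facebook, analogously, seeing several friends share and like an article can outweigh your own private doubts about it.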
Understanding how Facebook’s algorithms are inherently biased, and how information and opinions diffuse through a network, is key to fixing the problem of living in a social-network bubble. Hopefully, through this understanding, we can discover better solutions for the future.