Facebook and Google’s Effects on Political Opinions
How Facebook and Google’s Algorithms Are Affecting Our Political Viewpoints
Recent studies have shown that the algorithms used by Facebook and Google to display relevant content to users can skew users’ political opinions. The Facebook News Feed uses a complicated algorithm to decide what posts a user may want to see. This algorithm examines many different attributes of each post, such as whether the user has liked previous posts from the same person. One effect of this curation is that users are more likely to see posts supporting their political views than to see posts with alternate political views.
This makes sense when considering the properties of social networks. A user’s close friends are likely to share the same political views; indeed, shared views may be part of why they are close friends. In addition, a user interacts with close friends more frequently than with acquaintances. As the article mentions, the News Feed algorithm weights posts partly by how much a user interacts with their authors. A user is therefore more likely to see posts from close friends, whose timelines the user has commented on, than from acquaintances, who may receive only an occasional like. The outcome is that users more often see posts from people who share their political opinions.
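To make this concrete, here is a minimal sketch of interaction-weighted ranking. This is not Facebook’s actual algorithm; the scoring rule, names, and interaction counts are all invented for illustration, and they show only how weighting by author interaction can push close friends’ posts to the top of a feed.

```python
# Toy sketch of interaction-weighted feed ranking (illustrative only).

def rank_feed(posts, interactions):
    """Sort posts by how often the user has interacted with each author."""
    return sorted(posts,
                  key=lambda p: interactions.get(p["author"], 0),
                  reverse=True)

posts = [
    {"author": "acquaintance", "text": "article with an opposing view"},
    {"author": "close_friend", "text": "article with a like-minded view"},
]

# Hypothetical interaction counts (comments, likes, etc.) per author.
interactions = {"close_friend": 120, "acquaintance": 3}

feed = rank_feed(posts, interactions)
print(feed[0]["author"])  # the close friend's post ranks first
```

Because close friends accumulate far more interactions, their posts, and by extension their political views, dominate the top of the ranked feed.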
This dynamic also exists in Google search, though through an information network rather than a social network. As the article states, Google’s search algorithm, like Facebook’s, uses various factors to decide which results are relevant. Although some of these factors involve the links between webpages, others are specific to particular users, such as the type of computer they are using. Most interestingly, the article reveals that a user’s location is a very important factor, and it is easy to see how this might affect the political slant of the results a user sees. Google tries to return the results a user is most likely looking for, so it would make sense for someone in a predominantly liberal area to see predominantly liberal results, for example.
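The location effect can be sketched in the same toy style. Again, this is not Google’s actual ranking; the scores, region labels, and boost value are invented, and the sketch only shows how a location signal layered on top of a base relevance score could tilt which political results surface first.

```python
# Toy sketch of location-personalized search ranking (illustrative only).

def rank_results(results, user_location):
    """Boost results whose region matches the user's location."""
    def score(r):
        # Hypothetical rule: add a fixed bonus for a regional match.
        return r["relevance"] + (0.5 if r["region"] == user_location else 0.0)
    return sorted(results, key=score, reverse=True)

# Two equally relevant results associated with different regions.
results = [
    {"title": "conservative op-ed", "relevance": 0.6, "region": "rural_state"},
    {"title": "liberal op-ed", "relevance": 0.6, "region": "coastal_city"},
]

# The same query from two locations yields different top results.
for loc in ("coastal_city", "rural_state"):
    print(loc, "->", rank_results(results, loc)[0]["title"])
```

Even with identical base relevance, the location bonus alone decides which viewpoint each user sees first.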
These kinds of algorithms raise interesting questions about the flow of information and about how certain features of a network can confine users to their own section of it. Personalization, showing users what they most likely want to see, often improves the experience, but it can also shape their opinions by withholding information they disagree with. Whether this is a good or a bad thing could become a significant debate as online networks grow ever more prevalent in everyday life.