Algorithms and The Movement of Ideas

This article discusses how Facebook, Google, and Twitter try to contain hate speech without infringing on free speech, and it touches on the stalled flow of ideas, especially ideas associated with each political party. It focuses primarily on the algorithm behind Facebook's news feed. That algorithm is designed to maximize revenue (unsurprisingly, since Facebook is a business), which means Facebook wants its users to spend as much time as possible scrolling through the feed. To keep people from leaving, Facebook ensures that the content presented to you aligns with your views: things you find funny (like memes), things you find heartwarming (like soldier homecomings), and things that fit your politics (like right- or left-leaning political videos). This is the problem the article explores. By molding a user's news feed to show only news and articles that match their personal views, the algorithm cuts that user off from new ideas and possible solutions presented by people with slightly (or radically) different views. Political parties have turned this into a "social media represses free speech" campaign, which, in all actuality, could be true.
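To make the idea concrete, here is a minimal sketch of engagement-based ranking. It is purely hypothetical and not Facebook's actual system: posts and interests are represented as made-up topic vectors, and the "predicted engagement" is just their dot product. The point is only that ranking by predicted engagement naturally surfaces view-aligned content and buries opposing views.

```python
# Hypothetical sketch of an engagement-ranked feed (NOT Facebook's real system).
# Each post and each user gets a topic vector; predicted engagement is modeled
# as the dot product between the user's interests and the post's topics.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def rank_feed(user_interests, posts):
    """Return posts sorted by predicted engagement, highest first."""
    return sorted(posts, key=lambda p: dot(user_interests, p["topics"]), reverse=True)

# Made-up interest dimensions: (memes, opposing politics, aligned politics)
user = [0.9, 0.1, 0.7]
posts = [
    {"id": "opposing-view", "topics": [0.0, 1.0, 0.0]},
    {"id": "funny-meme",    "topics": [1.0, 0.0, 0.1]},
    {"id": "aligned-video", "topics": [0.1, 0.0, 1.0]},
]

feed = rank_feed(user, posts)
print([p["id"] for p in feed])  # → ['funny-meme', 'aligned-video', 'opposing-view']
```

Even in this toy version, the opposing-view post lands at the bottom of the feed every time, which is exactly the filtering effect the article worries about.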


To illustrate how damaging this algorithm can be to our society, the article touches on two network theories: people form groups with others similar to themselves, and information passes easily within a group but with much more difficulty across groups. Several topics we've discussed in lecture can explain these two phenomena.


The first theory, that people form groups with others similar to themselves, can be explained with the Strong Triadic Closure Property. Say we have a group of people who each have at least two friends. They made these friends (strong ties) because they share similar interests, perhaps including political interests. By the Strong Triadic Closure Property, if a node has strong ties to two otherwise unconnected nodes, those two nodes will form at least a weak tie (and possibly a strong one). As time goes on, people meet more people who share similar interests in this way.


The second theory relates to the diffusion of ideas and technology. One might assume that new ideas spread quickly across weak ties, since people tend to get new information through weak ties (those ties reach people the original person does not know and therefore carry different information). However, new ideas are unique. Ideas must be accepted; they must be analyzed and vetted by the people hearing them for the first time, and sometimes people reject them. Thus, we cannot model the transfer of ideas simply as a transfer of information; rather, we must model it as the diffusion of a technology. For an idea to spread and be adopted, a person's inclination to adopt it must reach a certain threshold. This depends on two things: the fraction p of a person's friends who have already adopted the idea, and the threshold q that this fraction must reach before the person adopts it too. In groups where everyone thinks similarly and has similar opinions, p climbs quickly and can exceed even a fairly high q, so the idea spreads easily and rapidly within the group. However, once you travel outside a group, attitudes change. A weak link has few ties back into the original group, so only a small fraction of its neighbors have adopted the idea, p stays below q, and the weak link cannot be easily persuaded to adopt. Thus, the transfer of ideas halts at weak ties. What does this entail? In our divided political system, Democratic ideas spread easily among Democrats and Republican ideas spread easily among Republicans, but the spread from one party to the other is limited. Unfortunately, this can leave problems unsolved for lack of innovative ideas.
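The stalling effect described above can be simulated with a small threshold-model sketch. The graph below is invented: two tightly knit cliques joined by a single weak tie. Each node adopts the idea once the fraction of its neighbors who have adopted reaches the threshold q.

```python
# Threshold-model sketch of idea diffusion (toy graph, hypothetical numbers).
# A node adopts the idea once the fraction of its neighbors who have already
# adopted meets the threshold q.

def run_cascade(graph, seeds, q):
    """Spread adoption from the seed set until no further node crosses q."""
    adopters = set(seeds)
    changed = True
    while changed:
        changed = False
        for node, nbrs in graph.items():
            if node not in adopters:
                frac = sum(n in adopters for n in nbrs) / len(nbrs)
                if frac >= q:
                    adopters.add(node)
                    changed = True
    return adopters

# Two cliques {1, 2, 3} and {4, 5, 6}, bridged only by the weak tie 3–4.
graph = {
    1: [2, 3], 2: [1, 3], 3: [1, 2, 4],
    4: [3, 5, 6], 5: [4, 6], 6: [4, 5],
}
print(sorted(run_cascade(graph, {1}, q=0.5)))  # → [1, 2, 3]
```

Starting from node 1, the idea fills the left clique, but node 4 only ever sees one adopting neighbor out of three (fraction 1/3 < 0.5), so the cascade stops dead at the bridge, just as the article predicts for ideas crossing party lines.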


I understand that Facebook is a great place to share ideas and political leanings, but one would be foolish to think it is the only reliable source of information. People who are truly interested in the state of our country and in current affairs would do best to read a range of news sites that are less biased. So, while it is true that Facebook's algorithm may not facilitate the spread of new ideas, it is wrong to blame only the algorithm, as the article does. Most people I know look elsewhere for news, which suggests that the stalled flow of ideas is not manufactured by some digital logic but is ingrained in us as humans. We are quick to fight, but perhaps what is needed is open minds and calmer tones.


