Filter Bubbles and Information Cascades

Within a filter bubble on social media, the harmful effects of an information cascade can be even stronger. While customization features on social media platforms and search engines can be quite helpful, providing efficient access to the information most relevant to a user, they can also be extremely damaging to the fabric of a democracy. The general model of information cascades rests on the fact that people can see what other people do, but not what they know. Using Facebook as an example, say that liking or sharing a post indicates accepting it, while disliking it or leaving a negative comment indicates rejecting it. Observers do not always know why someone accepted the post; all they know is that the person chose to interact with it positively. If an individual (person 3) sees that a user they follow (person 1) interacted positively with a post, that is one sign of acceptance. If another user they follow (person 2) also interacted positively with it, that is a second sign of acceptance. Under the Sequential Decision-Making model from Chapter 16, this can trigger a cascade: person 3 assumes that persons 1 and 2 each had a private signal that led them to accept the post, and that inference sways person 3's own decision, especially if person 3 lacks the knowledge to form an independent opinion about the post. A short sketch of this counting-based model follows.
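The sketch below is a minimal simulation of that counting-based model, under the simplifying assumption that every "positive interaction" is a binary accept/reject choice. The function name simulate_cascade, the signal-accuracy parameter q, and the tie-breaking rule (follow your own signal) are illustrative assumptions taken from the textbook-style model, not a description of how Facebook actually works.

```python
import random

def simulate_cascade(n_people=10, q=0.7, seed=None):
    """Counting-based sequential decision-making (information cascade) model.

    Each person gets a private signal suggesting "accept" with probability q,
    observes only the accept/reject choices of earlier people, and decides by
    counting the signals those earlier choices reveal.
    """
    rng = random.Random(seed)
    inferred_accept = 0   # signals revealed by earlier, pre-cascade choices
    inferred_reject = 0
    in_cascade = False
    decisions = []

    for _person in range(n_people):
        own_accept_signal = rng.random() < q  # private signal

        if in_cascade:
            # Earlier choices already outweigh any single private signal,
            # so this person simply copies the majority: a cascade.
            accept = inferred_accept > inferred_reject
        else:
            total = inferred_accept - inferred_reject + (1 if own_accept_signal else -1)
            # Majority of known signals wins; a tie is broken by the
            # person's own private signal.
            accept = total > 0 or (total == 0 and own_accept_signal)
            # Before a cascade starts, a choice reveals the private signal
            # behind it, so later people can count it.
            inferred_accept += 1 if accept else 0
            inferred_reject += 0 if accept else 1
            in_cascade = abs(inferred_accept - inferred_reject) >= 2

        decisions.append(accept)
    return decisions

if __name__ == "__main__":
    # Once persons 1 and 2 both accept, person 3 (and everyone after)
    # accepts regardless of their own private signal.
    print(simulate_cascade(n_people=8, q=0.7, seed=3))
```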

While an information cascade can threaten the verifiability of information even when people are exposed to many different opinions, it is even more dangerous inside a filter bubble. When social media platforms such as Facebook customize feeds to keep users engaged, showing mainly information that aligns with their existing thinking, the likelihood of an information cascade is even higher: users are surrounded by others who are similarly inclined to accept or reject the same information, which reinforces their own opinions. Furthermore, people in separate filter bubbles are exposed to different information cascades that support their personal ideologies, with little overlap. The same news headline may be shown to everyone, but depending on which filter bubble you are in, you will see mostly one reaction to it, which reinforces your own thinking and limits exposure to other ideas. Exposure to opposing opinions is crucial for our democracy, and its absence threatens the search for truth in the age of misinformation. It is up to social media and search engine corporations to decide at what point the efficiency of accessing relevant information is no longer worth the cost to the strength of our democracy.
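To illustrate the claim that a filter bubble raises the likelihood of a cascade, the same toy model can be run with a "bubble prior": before anyone decides, the feed already shows a few like-minded users who accepted the post. The parameter bubble_boost and the Monte Carlo setup below are assumptions for illustration only; the resulting numbers say nothing about any real platform.

```python
import random

def accept_cascade_rate(trials=20000, n_people=20, q=0.7,
                        bubble_boost=0, seed=0):
    """Fraction of runs that end locked in an "accept" cascade on a FALSE post.

    The post is assumed false, so a private accept-signal occurs with
    probability 1 - q. bubble_boost is a toy stand-in for a filter bubble:
    that many like-minded "accepts" are already visible before anyone decides.
    """
    rng = random.Random(seed)
    locked_in = 0
    for _ in range(trials):
        inferred_accept, inferred_reject = bubble_boost, 0
        in_cascade = inferred_accept - inferred_reject >= 2
        for _person in range(n_people):
            if in_cascade:
                continue  # everyone copies the majority from here on
            own_accept_signal = rng.random() < (1 - q)
            total = inferred_accept - inferred_reject + (1 if own_accept_signal else -1)
            accept = total > 0 or (total == 0 and own_accept_signal)
            inferred_accept += 1 if accept else 0
            inferred_reject += 0 if accept else 1
            in_cascade = abs(inferred_accept - inferred_reject) >= 2
        if in_cascade and inferred_accept > inferred_reject:
            locked_in += 1
    return locked_in / trials

if __name__ == "__main__":
    for boost in (0, 1, 2):
        print(f"bubble_boost={boost}:",
              f"{accept_cascade_rate(bubble_boost=boost):.2%}",
              "of runs end in a wrong 'accept' cascade")
```

In this sketch, even one or two pre-seeded like-minded reactions sharply increase how often the group locks into accepting a false post, which mirrors the reinforcement effect described above.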

https://medium.com/@10797952/the-causes-and-effects-of-filter-bubbles-and-how-to-break-free-df6c5cbf919f
