Social Media’s Filter Bubble and Information Cascade
One day, on a whim, I searched for “dog treats” online. After clicking through several links, I lost interest and didn’t buy anything. But for the next few days, dog treat advertisements were everywhere – in my social media apps, on online shopping platforms, and even in my mailbox. Many of us have had the same feeling: technology companies are learning our preferences and showing us what we want to see. It can be a bit annoying sometimes, but isn’t it a good thing in general? The ads would be helpful if I really wanted to buy something for my dog, and even if I don’t, they are probably still more relevant than completely random ads. The social media companies are just trying to serve me content that I’m more likely to be interested in and click on. Why should we worry about this?
One reason is that this “algorithmic filter bubble” (using algorithms to predict one’s preferences and select the content one is served) can amplify the effect of an information cascade. First, let’s talk about what an information cascade is. In the model, a group of people make decisions sequentially, and each person observes their own private signal as well as the decisions made by everyone before them. Although players cannot see others’ signals, they can sometimes infer them from others’ decisions. An information cascade can occur after a run of identical decisions: given the signals that can be inferred from those decisions, following the crowd has a higher expected payoff no matter what the next person’s own signal says. That means even a completely rational person will no longer be influenced by their own signal. Once a cascade forms, it is difficult to break, because every person after that point faces the same situation and likewise ignores their own signal.
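To make the model concrete, here is a minimal simulation sketch of the standard sequential cascade setup. The specific parameters (ten people, a binary High/Low signal that is correct with probability 2/3, and the tie-breaking rule of following one’s own signal) are assumptions for illustration, not part of the discussion above.

```python
import random

def run_cascade(n_people=10, p=2/3, state_good=True, seed=0):
    """Simulate the classic sequential information-cascade model.

    Each person sees a private High/Low signal (correct with probability p)
    plus all earlier decisions, and follows the crowd once the decisions it
    can infer signals from lean by two or more toward one side; otherwise
    the person follows their own signal.
    """
    random.seed(seed)
    decisions = []        # public history of Accept/Reject
    inferred_lean = 0     # net count of High minus Low signals inferable so far
    in_cascade = False

    for i in range(n_people):
        signal_high = random.random() < (p if state_good else 1 - p)

        if inferred_lean >= 2:        # evidence leans High by 2+: accept
            decision = "Accept"       # regardless of own signal
            in_cascade = True
        elif inferred_lean <= -2:     # evidence leans Low by 2+: reject
            decision = "Reject"
            in_cascade = True
        else:
            decision = "Accept" if signal_high else "Reject"
            # outside a cascade, the decision reveals the private signal
            inferred_lean += 1 if signal_high else -1

        decisions.append(decision)
        print(f"Person {i+1}: signal={'H' if signal_high else 'L'}, "
              f"decision={decision}, cascade={in_cascade}")
    return decisions

run_cascade()
```

Running this a few times with different seeds shows how quickly a run of two identical (and possibly wrong) early decisions can lock everyone afterwards into the same choice.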
However, in reality we usually cannot see everyone’s decisions. In social media apps such as Facebook or Twitter, we only see content from those we follow, or content recommended to us “based on your interests”. What’s more, we naturally tend to befriend people who are similar to us and share our views or background. As a result, one’s social media feed is likely to be more homogeneous than reality, and we might incorrectly conclude that an opinion is dominant when it is only dominant within a small cluster around us.
Now suppose that in the information cascade model the signal is not completely random, but its interpretation is shaped by one’s own prior opinion. Then the filter bubble can strongly affect one’s decision and create a “fake cascade”. The following figure shows a simple network with two clusters, Blue and Red. The Blue nodes all interpret their signals as Low, while the Red nodes interpret theirs as High. Each person can only see the decisions made by their connections. In this case, persons 1, 2, 3, and 4 are all guided by their own signals (their decisions are Reject, Accept, Reject, and Accept, respectively). Since we can see the whole picture, we know the High and Low signals are fairly even, and no cascade has occurred yet. However, things look very different from inside the clusters. Person 5 observes two Rejects and will thus choose Reject no matter what his/her own signal says. Similarly, person 6 observes two Accepts and will always choose Accept. Now an information cascade happens in both clusters, and in opposite directions!
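Here is a minimal sketch of that two-cluster example. The network layout (who can see whose decisions) and the signal interpretations are hypothetical, chosen to mirror the figure described above; the decision rule is the same as in the sequential model, but each person only counts the decisions visible through their own connections.

```python
def decide(own_signal_high, observed_decisions):
    """If the decisions a person can see lean by two or more toward one
    side, follow that side; otherwise follow one's own signal."""
    lean = sum(+1 if d == "Accept" else -1 for d in observed_decisions)
    if lean >= 2:
        return "Accept"
    if lean <= -2:
        return "Reject"
    return "Accept" if own_signal_high else "Reject"

# Signals as interpreted by each person: the Blue cluster (1, 3, 5) reads
# Low, the Red cluster (2, 4, 6) reads High.
signal_high = {1: False, 2: True, 3: False, 4: True, 5: False, 6: True}

# Who each person can observe (earlier decisions of their connections).
connections = {1: [], 2: [], 3: [1], 4: [2], 5: [1, 3], 6: [2, 4]}

decisions = {}
for person in range(1, 7):
    observed = [decisions[c] for c in connections[person]]
    decisions[person] = decide(signal_high[person], observed)
    print(f"Person {person}: sees {observed or 'nothing'}, "
          f"decides {decisions[person]}")

# Person 5 ends up rejecting and person 6 accepting, regardless of their
# own signals: opposite cascades inside the two clusters, even though the
# signals across the whole network are evenly split.
```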
In reality, networks are often much more complicated, and we can always talk with our friends and ask what exactly they observed. Still, this kind of “fake cascade” can occur within our social networks. Social media apps tend to send us articles we agree with and posts from friends who share our opinions. This can lead us to incorrect beliefs or decisions, and it is often hard to detect. Is there anything we can do, then? A lot! For example, read from a variety of sources and talk to different people. Most importantly, recognize that different opinions exist, and don’t let the cascade fool you.