Information cascades and the decline of cooperation in social networks

Sequential decisions within networks, like social media networks, can be modeled in ways that make them easier to analyze. One important (and interesting) phenomenon such an analysis reveals is the information cascade. Consider a chain of sequential decisions (for example, to follow someone or not, to share a post or not) as influenced by both private and public information. In the case of sharing a post, the public information might be the number of people who have already shared it, and the private information might be your own judgment of the post. Each individual in this setting can see the decisions of everyone who decided before them, and also holds a private piece of information of their own, a "signal". An information cascade occurs when a chain of people make the same decision, one after another, relying only on the public information available and disregarding their own signals.
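To make the mechanism concrete, here is a minimal sketch in Python of the classic sequential-decision cascade setup (the function name, parameters, and majority-counting decision rule are my own illustrative choices, not taken from the paper): each agent's private signal is right more often than not, but once the public record of earlier decisions leans far enough one way, the signal stops mattering.

```python
import random

def simulate_cascade(n_agents=50, signal_accuracy=0.7, true_state=1, seed=0):
    """Chain of sequential binary decisions with private signals.

    Each agent receives a private signal that matches `true_state` with
    probability `signal_accuracy`, sees every earlier decision (the
    public information), and picks whichever option the combined
    evidence favors; ties are broken by the agent's own signal.
    """
    rng = random.Random(seed)
    decisions = []
    for _ in range(n_agents):
        signal = true_state if rng.random() < signal_accuracy else 1 - true_state
        # Net public lead of option 1 over option 0 among earlier agents.
        lead = sum(1 if d == 1 else -1 for d in decisions)
        # The private signal counts as one extra "vote".
        total = lead + (1 if signal == 1 else -1)
        if total > 0:
            decisions.append(1)
        elif total < 0:
            decisions.append(0)
        else:
            decisions.append(signal)  # tie: follow own signal
        # Once the public lead reaches 2 in either direction, no single
        # signal can flip the sign: the cascade has locked in.
    return decisions

print(simulate_cascade())
```

With an unlucky early run of wrong signals, every later agent follows the crowd and the whole chain settles on the wrong answer, which is exactly the failure mode described next.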

One potential feature of an information cascade is that the cascading decision can be wrong. Some number of people may hold signals pointing to the correct decision, but because each person sees only the decisions made by the people before them, not the signals behind those decisions, they cannot know the true proportion of people whose information pointed to a different choice, and they cannot factor that into their own decision. The incorrect decision therefore propagates. Yang et al. explore this in the context of newcomers to a social network. Their model is a social network of "cooperator" nodes and "cheater" nodes: people willing to cooperate with other nodes, and bad-faith actors. A new node joins the group and picks an existing member to model its behavior on, using its private information along with the public information, namely knowledge of the connections in the group. Ideally, this new node would pick a cooperator node to copy. But if a cheater node is highly connected, new nodes might model themselves off it instead, following a self-reinforcing chain of bad decisions: as more nodes connect to the cheater, it becomes even more highly connected and thus an even "better" role model, at least according to the public information available. Eventually this can destroy the network's willingness to cooperate, as the toy sketch below suggests.
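As a rough illustration of that self-reinforcing dynamic, here is a toy sketch only; the growth rule, parameters, and names are my own simplification and not the actual model in Yang et al., which uses game-theoretic payoffs and evolutionary dynamics. Each newcomer picks a role model with probability proportional to its degree and copies its strategy, so whichever strategy gets an early connectivity lead tends to take over.

```python
import random

def grow_network(steps=200, seed=1):
    """Toy growth model: newcomers copy highly connected role models.

    Nodes hold a strategy, 'C' (cooperator) or 'D' (cheater). Each
    newcomer picks a role model with probability proportional to its
    degree (the public information), copies its strategy, and links to
    it and one of its neighbors.
    """
    rng = random.Random(seed)
    strategies = {0: "C", 1: "D"}      # start with one of each
    edges = {0: {1}, 1: {0}}
    for new in range(2, steps):
        nodes = list(edges)
        weights = [len(edges[v]) for v in nodes]
        # Degree-weighted choice: public information only, no signals.
        model = rng.choices(nodes, weights=weights, k=1)[0]
        strategies[new] = strategies[model]        # copy the role model
        neighbor = rng.choice(sorted(edges[model]))
        edges[new] = {model, neighbor}
        for t in edges[new]:
            edges[t].add(new)
    return sum(s == "C" for s in strategies.values()) / len(strategies)

print(f"final cooperator fraction: {grow_network():.2f}")
```

Run it with different seeds and the final cooperator fraction swings toward 0 or 1 depending on which initial node happens to attract early connections, which is the lock-in behavior the cascade story predicts.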

This sort of model is especially intriguing in the context of that negative cascade. Bad information can spread virally online through just such a self-reinforcing chain: people reblog a post, more people see the post and the number of people who have already reblogged it, rely on that count instead of whatever private information they have about the post, and spread it further. It's fascinating to see that there is a rational basis for the spread of misinformation beyond poor online literacy or stupidity, the explanations I usually see. It also suggests that the most effective, and perhaps the only effective, way to stop the spread of bad information or anti-cooperative behavior through a network is to interrupt the cascade itself: for example, by banning bad actors and removing the connections that make them attractive role models, or by taking down misinformation entirely, taking the choice out of the hands of the people it might reach. Either way, negative information cascades will only become more prevalent, and the networks we communicate in need to account for that and take precautions, or face the destruction of trust in the network.

 

Yang, G., Csikász-Nagy, A., Waites, W., Xiao, G., & Cavaliere, M. (2020). Information cascades and the collapse of cooperation. Scientific Reports, 10(1), 8004. https://doi.org/10.1038/s41598-020-64800-z
