Diffusion, fake news, and COVID-19

https://www.scientificamerican.com/article/information-overload-helps-fake-news-spread-and-social-media-knows-it/

This article, titled “Information Overload Helps Fake News Spread, and Social Media Knows It,” discusses ideas closely related to our class. It opens with a hypothetical situation involving a person named Andy, who relies on his closest friends for tips on COVID-19. Due to his cognitive biases, he ends up joining an online group of people who believe that COVID-19 is a “hoax.” Andy’s experience illustrates how easily misinformation spreads on the internet, as social media and search engines steer people toward content that confirms their suspicions and toward like-minded people. Because of information overload on the internet, people use mental shortcuts to decide what to pay attention to, and often the information they settle on is of low quality. Researchers in the article ran a simulation (shown in the image below) in which users could post new memes or share existing ones in a network. They found that as the number of memes (the total information load) increased, the quality of the information being shared decreased. Thus, even the most careful users with good intentions tend to share misinformation, because they cannot see every post in their feeds.

[Image: network simulation of information overload]
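The researchers' simulation can be sketched as a toy model of limited attention. The specifics here are my assumptions, not the article's exact model: meme quality is a random number in [0, 1], and an overloaded user only scans the newest few posts in their feed before resharing the best one they happen to see.

```python
import random

def simulate(feed_limit, n_steps=5000, seed=0):
    """Toy model of limited attention: each new meme has a random quality,
    but a user only scans the `feed_limit` newest memes before resharing
    the best one seen. A smaller window models heavier information overload.
    Returns the average quality of what actually gets reshared."""
    rng = random.Random(seed)
    feed = []
    shared_quality = []
    for _ in range(n_steps):
        feed.append(rng.random())           # a new meme arrives
        window = feed[-feed_limit:]         # limited attention: newest posts only
        shared_quality.append(max(window))  # reshare the best meme actually seen
    return sum(shared_quality) / len(shared_quality)

# More overload (a smaller attention window) lowers the quality of what spreads,
# even though every user is "carefully" picking the best meme they can see.
low_overload = simulate(feed_limit=50)
high_overload = simulate(feed_limit=2)
```

Even in this stripped-down sketch, shrinking the attention window drags down the average quality of reshared memes, mirroring the article's finding that overload, not bad intentions, degrades what circulates.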

Additionally, people tend to seek out and remember information that reinforces their beliefs, even if that information is unverified. People therefore end up in misinformed communities, since they confuse popularity with quality. The article also notes that search engines make this worse: by personalizing results based on a user’s data, they surface agreeable information, reinforcing echo chambers and homophily. Finally, the researchers discuss how bots can start or amplify the spread of misinformation; they found that bots need to infiltrate only a small fraction of a network for the entire network to become a misinformed community.

This article can be discussed in terms of the diffusion of information. As we saw in class, people trust their closest friends (strong ties within a cluster) the most when it comes to adopting a new belief. Similar to the researchers, we can model a social network in which nodes represent users deciding whether or not to share a fake news post. In a network with tightly knit clusters, we know from class that people are unlikely to share the post unless someone in their own cluster does. If we treat the trust between friends as the benefit of sharing the post, a user gains more benefit from sharing a post from a close friend (strong link) than from a distant friend (weak link), and this benefit grows if the post reinforces the user’s existing beliefs. This supports the article’s claim that people end up in echo chambers, surrounded by like-minded people who share posts that reinforce their beliefs. Ultimately, if the trust in a friend and the similarity in views (the benefit) are great enough, a user is likely to share the same post, even if it is fake news.

Additionally, the article explains that bots can very easily infiltrate a social network, which relates to the threshold aspect of diffusion. Returning to the model outlined above: if a bot shares a fake news post that reaches one or two nodes in a cluster, that can be enough to push every other node in the cluster past its adoption threshold, causing a cascade of shares of this post.
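The cascade described above can be sketched with a minimal threshold-diffusion model. The cluster shape, node labels, and the 0.4 threshold are illustrative assumptions of mine, not figures from the article:

```python
def cascade(adjacency, seeds, threshold):
    """Threshold diffusion: a node shares the post once the fraction of its
    neighbors who have already shared reaches `threshold`.
    Returns the final set of sharers."""
    shared = set(seeds)
    changed = True
    while changed:
        changed = False
        for node, neighbors in adjacency.items():
            if node in shared:
                continue
            frac = sum(n in shared for n in neighbors) / len(neighbors)
            if frac >= threshold:
                shared.add(node)
                changed = True
    return shared

# A tightly knit 5-node cluster: everyone is connected to everyone else.
cluster = {i: [j for j in range(5) if j != i] for i in range(5)}

# A bot's post reaches just two nodes (0 and 1); with a threshold of 0.4,
# every remaining node sees 2 of its 4 neighbors sharing (0.5 >= 0.4),
# so the whole cluster cascades.
result = cascade(cluster, seeds={0, 1}, threshold=0.4)
```

Raising the threshold to 0.6 in this same cluster stops the cascade at the two seeded nodes, which matches the intuition that bots only succeed when a few infiltrated nodes are enough to tip their neighbors past the adoption threshold.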

The idea that fake news spreads easily on the internet is supported by the diffusion models we have discussed in class, since adoption often depends on the relationship between neighbors and on the number of neighbors who have adopted an idea (its popularity) rather than on the actual quality of the information. This article also led me to think about communities of anti-vaxxers and of eager supporters of the COVID-19 vaccine. On one hand, the anti-vaxxers are most likely in echo chambers that feed them alarming claims about vaccine side effects, and their close friends are likely sharing articles skeptical of the COVID-19 vaccine because of the short timeline on which it was developed. On the other hand, some people may be so eager to get the vaccine because they are desperate to return to “normal life.” In this over-eager community, users are most likely seeing positive information about the vaccine and its efficacy while ignoring the side effects and the short timeline. Both sides can be problematic, as users are essentially stuck in tight-knit clusters where information is extremely biased. Additionally, it is unlikely that there are many strong links between the two communities, since their views differ so sharply, so neither side is receiving information from the other. Ultimately, this points to a larger issue in the diffusion of information: conformity within clusters is so common that the information circulating in different communities diverges greatly.
