How Online Bots Cause The Spread of Misinformation

2020 has proven to be a year that many of us are ecstatic to leave behind. Although we are less than a month away from closing this chapter in the history books, it is imperative that we analyze what happened in 2020 and, most importantly, learn from it. The saying "those who do not learn from history are doomed to repeat it" continues to hold true time and time again. What has fascinated me most this year is the so-called "fake news" we hear so much about in the media and even in our college lectures. From claims of voter fraud in the presidential election to millions calling the COVID-19 pandemic a hoax, fake news has been more prevalent than ever, and as a daily consumer of news and media, I find it progressively more difficult to decipher what information is and isn't factual. This is largely due to bots, or fake accounts, sharing and reposting false information over and over again.

One article from Scientific American, titled "Information Overload Helps Fake News Spread, and Social Media Knows It," delves into the many ways false information spreads and why it is almost impossible for us to keep up with what is true and what is false. I'm focusing on the section about bots because it ties directly into the economic concept of diffusion that we have examined in class. We understand diffusion to be the process by which a new idea or product spreads and is accepted by a new audience. Furthermore, our knowledge of information cascades reminds us that people often make decisions by observing what those around them are doing, and they will not hesitate to abandon their own signal if enough signals around them point a different way. This is what makes online bots so powerful in spreading information. Take a second and think about your own social media. If every time you logged in to Instagram you were flooded with posts claiming COVID-19 isn't real and is nothing more than collusion between pharmaceutical companies and corrupt politicians, you might begin to believe it, perhaps even going so far as to disobey stay-at-home orders and break the law.
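To make the "abandon your own signal" idea concrete, here is a minimal sketch of the classic sequential information-cascade model from class (this is my own illustrative code, not anything from the article): each agent gets a private signal that is correct with some probability, but follows the majority of earlier choices whenever that majority outweighs a single signal.

```python
import random

def run_cascade(n_agents=50, p_correct=0.7, seed=0):
    """Sequential cascade: agents choose 0 or 1, seeing all prior choices."""
    rng = random.Random(seed)
    truth = 1  # the "correct" choice agents' signals point toward
    choices = []
    for _ in range(n_agents):
        # Private signal is right with probability p_correct.
        signal = truth if rng.random() < p_correct else 1 - truth
        ones = sum(choices)
        zeros = len(choices) - ones
        # A majority lead of more than one outweighs a lone private signal,
        # so the agent imitates the crowd and a cascade locks in.
        if ones - zeros > 1:
            choice = 1
        elif zeros - ones > 1:
            choice = 0
        else:
            choice = signal
        choices.append(choice)
    return choices

choices = run_cascade()
print(choices)
```

The key takeaway is that once a small early majority forms, every later agent ignores their own signal, which is exactly the dynamic bots exploit by manufacturing that early majority.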

The use of bots took off once people realized we can only process so much information at a time. Attention economics refers to the idea that one's attention is a scarce commodity: we can only focus on a given piece of information for so long, and many economists see today's economy as an attention economy. This concept led researchers at Indiana University to wonder how people would share information under limited attention, and they ran simulations to explore the question further.

What they found was that, even though most people genuinely want to share and consume higher-quality information, our inherent inability to process the massive amount of news on our feeds makes that nearly impossible. Because of this "lack of attention," so to speak, we inevitably end up believing and sharing information that is partly or completely false.
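A toy version of this limited-attention dynamic can be sketched in a few lines (the names and parameters below are my own assumptions, loosely inspired by the simulations the article describes, not the researchers' actual code): each agent keeps only the last few memes it has seen, and reshares from that short feed, so a meme's popularity ends up having little to do with its quality.

```python
import random

def simulate(n_agents=100, steps=2000, memory=5, p_new=0.1, seed=1):
    """Agents share memes (quality in [0,1]) under a hard attention limit."""
    rng = random.Random(seed)
    feeds = [[] for _ in range(n_agents)]
    shares = {}  # meme quality -> number of times it was shared
    for _ in range(steps):
        sharer = rng.randrange(n_agents)
        if rng.random() < p_new or not feeds[sharer]:
            meme = rng.random()  # a brand-new meme of random quality
        else:
            meme = rng.choice(feeds[sharer])  # reshare from the limited feed
        shares[meme] = shares.get(meme, 0) + 1
        receiver = rng.randrange(n_agents)
        feeds[receiver].append(meme)
        del feeds[receiver][:-memory]  # attention limit: keep only the last few
    return shares

shares = simulate()
top = max(shares, key=shares.get)
print(f"most-shared meme quality: {top:.2f}, shares: {shares[top]}")
```

Because agents can only reshare what happens to still be in their short feed, early luck matters more than quality, which mirrors the researchers' finding that limited attention lets low-quality information go viral.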

The article then points to more psychology-focused studies showing that humans adjust their understanding of new information so that it fits with what they already know. What matters for our discussion of bots and misinformation is that these studies acknowledge what is called confirmation bias: "People often seek out, recall and understand information that best confirms what they already believe" (Scientific American). Confirmation bias is essentially what keeps the spread of misinformation going. Once someone is bombarded with misinformation from these bots and begins to believe it, they will seek out only information that agrees with what they already believe and will use it to try to spread those views to the people around them.

Deeper into the article, we learn more about bots and how they "pollute" the internet with misinformation.

[Figure from the Scientific American article: two simulated social networks, one with few bot accounts and one with many]

This figure shows two separate social networks in which information is being shared. Pink nodes represent real accounts with real users, while yellow nodes represent fake accounts managed by bots. The shade of pink indicates the quality of the information being communicated: the darker the shade, the worse the information. As we can see, the network on the left, with only a few yellow nodes (bots), exchanges higher-quality information overall. The study specifies that when bots make up 1% or less of the network, the information stays of generally high quality. The network on the right is clearly full of bots, and its pink nodes are much darker, indicating that the quality of information is much lower. Essentially, these networks show that just a few bots in a small starting network can lead to fake news going viral as that network grows.
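The figure's point can be illustrated with a small extension of the limited-attention sketch above (again, an assumption-laden toy model of my own, not the study's code): bots inject only zero-quality memes, and we compare the average quality circulating in a network with 1% bots against one with 50% bots.

```python
import random

def avg_quality(bot_fraction, n_agents=200, steps=5000, memory=5, seed=2):
    """Average quality of shared memes when a fraction of agents are bots."""
    rng = random.Random(seed)
    n_bots = int(n_agents * bot_fraction)
    feeds = [[] for _ in range(n_agents)]
    total = 0.0
    for _ in range(steps):
        sharer = rng.randrange(n_agents)
        if sharer < n_bots:
            meme = 0.0  # bots flood the network with zero-quality memes
        elif feeds[sharer] and rng.random() < 0.9:
            meme = rng.choice(feeds[sharer])  # humans mostly reshare
        else:
            meme = rng.random()  # humans occasionally create new memes
        total += meme
        receiver = rng.randrange(n_agents)
        feeds[receiver].append(meme)
        del feeds[receiver][:-memory]  # same attention limit as before
    return total / steps

low_bots = avg_quality(0.01)
many_bots = avg_quality(0.50)
print(low_bots, many_bots)  # the bot-heavy network circulates lower quality
```

Because humans reshare from feeds the bots have already polluted, the damage compounds beyond the bots' own posts, matching the figure's contrast between the mostly light-pink network and the dark one.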

The prevalence of bots has skyrocketed in the last decade or so due to the boom in popularity of social media. The article estimates that up to 15% of Twitter accounts in 2017 were in fact bots, and that they played a substantial role in the 2016 presidential election. One prominent example of misinformation was the claim that Hillary Clinton and her campaign team were involved in "occult rituals," essentially witchcraft and magic. When this fake information was posted, thousands of bots would like and retweet it, causing real people to do the same because of the posts' apparent popularity. Consistent with what we have discussed in class, bots have become a powerful tool for ill-intentioned users to create fake information cascades that spread misinformation, and they act as catalysts in the diffusion of online information overall.

Sources:

https://www.scientificamerican.com/article/information-overload-helps-fake-news-spread-and-social-media-knows-it/

All images are from the Scientific American article
