Using PageRank to Detect Fraudulent Websites

http://www.computerworld.in/feature/can-algorithms-really-tackle-fake-news-fiasco

 

In the wake of the 2016 Presidential Election, the idea of "fake news" has become a widespread and widely recognized phenomenon. Many news podcasts and websites report fabricated stories without any reliable or credible sources, and readers who encounter those stories often accept them as fact. To help distinguish reliable news websites from fraudulent ones, this article examines whether Google's PageRank can gauge the reliability of an authority based on the websites that link to it. As was discussed in class, PageRank scores authorities (in this case, news websites) with an algorithm that counts the number of websites linking to them, while also weighting each incoming link by how reputable the linking site is at pointing to other trustworthy authorities.
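To make the idea concrete, the following is a minimal sketch of the PageRank iteration described above, written in Python. The toy link graph, the site names, and the damping factor are illustrative assumptions, not taken from the article; the point is simply that a site nobody links to ends up with a low score.

# Toy directed link graph: each site maps to the sites it links to.
links = {
    "reliable-news.example": ["wire-service.example"],
    "wire-service.example":  ["reliable-news.example"],
    "fake-news.example":     ["reliable-news.example"],  # links out, but receives no links
}

def pagerank(links, damping=0.85, iterations=50):
    sites = list(links)
    n = len(sites)
    rank = {s: 1.0 / n for s in sites}  # start from a uniform distribution
    for _ in range(iterations):
        # Every site keeps a small base score, plus shares passed along links.
        new_rank = {s: (1.0 - damping) / n for s in sites}
        for site, outgoing in links.items():
            share = damping * rank[site] / max(len(outgoing), 1)
            for target in outgoing:
                new_rank[target] += share  # a site's score flows to the sites it links to
        rank = new_rank
    return rank

if __name__ == "__main__":
    for site, score in sorted(pagerank(links).items(), key=lambda kv: -kv[1]):
        print(f"{site}: {score:.3f}")

Running this sketch, "fake-news.example" receives only the small base score each round because no other site links to it, which mirrors the intuition the article relies on: trustworthy sites tend to accumulate links from other trustworthy sites.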

The article also discusses how Facebook, like Google, has an algorithm for assessing the legitimacy of news articles, but that algorithm has failed to detect fabricated stories in the past. While it may be easy for a human to judge whether an article is factual, for example by checking whether it cites any sources or whether commenters flag it as fabricated, such techniques are hard for an algorithm to carry out. And although PageRank is an ingenious way to quantify the trustworthiness of news websites from link structure rather than content, it has flaws of its own (the publicly visible PageRank score was even retired recently).

In the near future, algorithms may be able to reliably attach a trustworthiness rating to news sites. For now, however, even the leaders in algorithmic research and technology, such as Google and Facebook, have difficulty determining whether a given news article is factual or fabricated.
