Weakness of PageRank
Since the PageRank algorithm is public, I wondered whether spammers could create pages with the sole intention of ranking highly, without any high-quality content, just to get more hits (probably for increased ad revenue). After a quick search, it turns out that this is indeed a weakness of many ranking algorithms! It seemed to be a major issue with earlier algorithms around 2011 or so, but as newer ones came out the problem diminished. Perhaps this was achieved by a plagiarized-material detector that identified the original source among the copies and filtered out the spam sites?

Anyhow, it seems that the latest algorithms are not without their problems either. Possum, for example, seems to create competition within a region: results located in densely populated areas have to compete hard for ranking, while results in rural areas have it much easier.
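To make the link-spam idea concrete, here is a minimal sketch of PageRank via power iteration on a hypothetical toy graph (the graph, page names, and farm size are all made up for illustration). It shows how a "link farm" of pages whose only purpose is to link to a target can inflate that target's score:

```python
def pagerank(graph, damping=0.85, iters=100):
    """graph: dict mapping each page to the list of pages it links to."""
    pages = list(graph)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        # every page gets the "random jump" base share
        new = {p: (1 - damping) / n for p in pages}
        for p, links in graph.items():
            if links:
                # a page splits its rank evenly among its outlinks
                share = damping * rank[p] / len(links)
                for q in links:
                    new[q] += share
            else:
                # dangling page: spread its rank over all pages
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

# A small "honest" web of three pages linking in a cycle.
web = {"a": ["b"], "b": ["c"], "c": ["a"]}

# Add a spam page propped up by ten farm pages that exist
# only to link to it.
web["spam"] = ["a"]
for i in range(10):
    web[f"farm{i}"] = ["spam"]

ranks = pagerank(web)
# The spam page outranks every individual farm page because it
# collects all of their rank, even though none of them has content.
print(ranks["spam"] > ranks["farm0"])  # → True
```

The farm pages themselves each receive only the base random-jump share, but funnelling all of it into one target is enough to lift that target well above pages with no such backing, which is exactly the manipulation early spam-detection updates had to address.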