Google AdSense Algorithm Leads to Discriminatory Selections

Ads can be helpful. When done well, targeted ads can point a viewer in the right direction as they look for a certain item online or in a store. Unfortunately, targeted ads can also lead to unforeseen, and potentially damaging, outcomes. This was the case for a collection of ads served by the Google AdSense algorithm in response to certain search terms, as documented by Latanya Sweeney.

Google uses an algorithm to rank pages, as discussed in class. In this context, the ads serve as the pages, and the ranking mechanism is meant to maximize the probability that a user will click on a given ad. Sweeney notes in her paper that when a user enters a name traditionally given to black infants into the search field, there is a disproportionately high likelihood that an ad suggesting the individual has an arrest record is displayed at the top of the search results. This likelihood is far lower for most names traditionally given to white infants, indicating the presence of discrimination in the algorithm's results.
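To make the mechanism concrete, here is a heavily simplified, hypothetical sketch of click-optimized ad ranking. It is not Google's actual AdSense implementation, and the ad texts, bids, and click rates are invented. It only illustrates how ordering ads by bid times predicted click-through rate lets historical click behavior, including biased clicking patterns, push an "Arrested?" ad to the top for some names.

```python
# Hypothetical sketch: ads ordered by expected value (bid * predicted CTR).
# All names and numbers are invented for illustration only.

from dataclasses import dataclass

@dataclass
class Ad:
    text: str
    bid: float            # what the advertiser pays per click
    predicted_ctr: float  # learned from historical click data

def rank_ads(ads: list[Ad]) -> list[Ad]:
    """Order ads by expected revenue per impression (bid * predicted CTR)."""
    return sorted(ads, key=lambda ad: ad.bid * ad.predicted_ctr, reverse=True)

# Because predicted_ctr is learned from user clicks, an ad that users click
# more often for a given name rises to the top, regardless of whether the
# association it implies (e.g., an arrest) is accurate or fair.
ads = [
    Ad("Public records search", bid=0.50, predicted_ctr=0.010),
    Ad("Firstname Lastname, Arrested?", bid=0.40, predicted_ctr=0.020),
]
print([ad.text for ad in rank_ads(ads)])
```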

This can be especially problematic if a potential employer searches a candidate's name and the results give the employer the impression that the candidate has been arrested. To alleviate this issue, Sweeney proposed a new way of calculating the quality score for these ads. By making minor adjustments to the quality score, Sweeney was able to offer Google a practical change to its AdSense algorithm that would prevent further discrimination in ad selection. Unfortunately, Google chose not to adopt this adjustment and instead simply removed the discriminatory ads from its ad pool.
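As a rough illustration of the kind of adjustment this points toward, one could imagine discounting an ad's quality score in proportion to how unevenly it is delivered or clicked across groups of names. The formula and parameters below are invented for illustration and are not taken from Sweeney's paper.

```python
# Hypothetical quality-score adjustment: penalize ads whose delivery or
# click rate is skewed across two name groups. The penalty weight and the
# disparity measure are assumptions made for this sketch.

def adjusted_quality_score(base_quality: float,
                           ctr_group_a: float,
                           ctr_group_b: float,
                           penalty_weight: float = 0.5) -> float:
    """Discount an ad's quality score in proportion to the relative gap in
    click-through rate between two groups of searched names."""
    disparity = abs(ctr_group_a - ctr_group_b) / max(ctr_group_a, ctr_group_b, 1e-9)
    return base_quality * (1.0 - penalty_weight * disparity)

# An ad clicked twice as often for one name group as the other loses a
# quarter of its quality score under these illustrative settings.
print(adjusted_quality_score(base_quality=1.0, ctr_group_a=0.02, ctr_group_b=0.01))
```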

Ranking algorithms such as those used by Google typically produce tailored results that benefit the user. Unfortunately, the data these algorithms operate on can contain unforeseen problems that lead to harmful outcomes. It is our responsibility, then, as future workers in the industry, to take precautionary steps to prevent these events from occurring in the first place. It is also our responsibility, once these algorithms are deployed, to keep a keen eye on their results to make sure they do not behave in unintended ways.

Citation:

Sweeney, Latanya. 2013. Discrimination in online ad delivery. Communications of the ACM 56, 5 (May 2013), 44-54. DOI: https://doi.org/10.1145/2447976.2447990
