Stereotypes, Search Results, and the Individual

Article: https://hbr.org/2016/10/new-evidence-shows-search-engines-reinforce-social-stereotypes

A search engine returns results relevant to your query based on previous search results used by other people in related queries. When googling ‘doctors’ and ‘nurses’, researchers noticed that the gender distributions in the returned images are often very unrepresentative of the actual distributions recorded by the U.S. Bureau of Labor Statistics. This phenomenon is known as search engine bias, where search engines exaggerate social stereotypes. The article goes on to describe two types of behaviors: warmth and agency. Warm actions are pro-social, and agentic actions are those that demonstrate competence. These two behaviors show how stereotypes are formed. Keeping the example of gender bias in mind, the researchers noted that traditional male stereotypes are built around agency more than warmth, and further explained that deviating from these perceptions leads to ‘economic or social’ backlash. They tested these societal stereotypes by searching for images of people exemplifying warm or agentic traits, such as “emotional” or “intelligent”, and noted that the results confirmed the previous assertions. The article finishes by proposing additions to search algorithms that would take bias into account and try to mitigate it, and by reminding the reader to account for bias when reading search results.

 

This pertains to what was discussed in class regarding search algorithms, specifically Hubs and Authorities. With hubs and authorities, the relevance of a particular authority to a search query depends recursively on the scores of the hubs that link to it, which in turn depend on the scores of the authorities those hubs link to. Social stereotypes exist outside of the internet, so individuals who accept these stereotypes are more likely to create content and to search for confirmation of these biases when interacting with search results. To draw a parallel with hubs and authorities again, the popularity of a stereotype could be said to be proportional to the initial in-degree of a particular result. Over time, as a popular result is deemed more and more relevant by a search algorithm, other results will be deemed less relevant and may never be seen by an individual using a search engine. Search engine ‘bias’, as the article calls it, seems to stem from already existing social stereotypes that reinforce themselves as relevant topics as more people click on them. As to the researchers’ suggestion that additional algorithms be implemented to mitigate bias, it is important to remember that it would be hard for any algorithm to differentiate between bias and relevant results if it still wants to use popularity to return relevant results. Using popularity to filter relevant results from the rest is an important aspect of search algorithms, so changing the algorithm is not an optimal solution. To quote the article, “The ways to address them [biases] may look little like the ways we address the biases in ourselves”.
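The self-reinforcement described above can be sketched with the hubs-and-authorities (HITS) update itself. The link graph, node names, and scores below are purely illustrative assumptions, not data from the article: a result pointed to by many hubs accumulates a high authority score, and each iteration amplifies that lead over less-linked results.

```python
def hits(links, iterations=50):
    """Run the HITS update on a link graph.

    links: dict mapping each hub to the list of authorities it points to.
    Returns (hub_scores, authority_scores).
    """
    nodes = set(links)
    for targets in links.values():
        nodes.update(targets)
    hub = {n: 1.0 for n in nodes}
    auth = {n: 1.0 for n in nodes}
    for _ in range(iterations):
        # Authority score: sum of hub scores of the pages linking to it.
        auth = {n: sum(hub[h] for h, ts in links.items() if n in ts)
                for n in nodes}
        # Hub score: sum of authority scores of the pages it links to.
        hub = {n: sum(auth[t] for t in links.get(n, [])) for n in nodes}
        # Normalize so scores stay bounded across iterations.
        a_norm = sum(v * v for v in auth.values()) ** 0.5 or 1.0
        h_norm = sum(v * v for v in hub.values()) ** 0.5 or 1.0
        auth = {n: v / a_norm for n, v in auth.items()}
        hub = {n: v / h_norm for n, v in hub.items()}
    return hub, auth

# Hypothetical graph: three hubs all link to one "popular" result,
# only one hub links to the "other" result.
links = {"hub1": ["popular", "other"], "hub2": ["popular"], "hub3": ["popular"]}
hub, auth = hits(links)
```

Here `auth["popular"]` comes out well above `auth["other"]`, mirroring the argument that popularity-driven ranking, by design, cannot distinguish a genuinely relevant result from a stereotype that many people happen to click on.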
