
Biases in Algorithms

In class we recently discussed how Google's search algorithm works. From the basic material we covered, the algorithm seems resistant to failure because of its very systematic way of ranking websites. But after considering how it works, could the algorithm still be flawed, particularly from a social perspective?
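To make the "systematic" part concrete, here is a minimal sketch of the link-based ranking idea behind Google's search (PageRank). This is an illustrative toy, not Google's actual code: the tiny web graph, the damping factor, and the iteration count are all made-up assumptions for the example.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Rank pages by links: a page is important if important pages link to it.

    `links` maps each page to the list of pages it links to.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal scores
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                # Each page passes its score evenly to the pages it links to.
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:
                # Page with no outgoing links: spread its score over everyone.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
        rank = new_rank
    return rank

# Hypothetical three-page web: A links to B and C, B links to C, C links to A.
web = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
scores = pagerank(web)
```

Notice that nothing in this ranking logic mentions people at all, which is exactly why it can feel so neutral and failure-resistant, and why the biases discussed below enter elsewhere, in what gets linked, labeled, and searched for.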

Well, as it turns out, many algorithms are indeed flawed, including the search algorithm. The reason is that algorithms are ultimately coded by individuals who inherently have biases. And although there continues to be a push to promote people of color in STEM fields, the reality at the moment is that the majority of people designing algorithms are White males. Because White males have certain privileges and biases, algorithms inevitably end up being designed, in a way, for a White audience. For instance, the chosen article says, “Searching for images of ‘professor’ will produce pictures of white males … but to find representations of women or people of color, the search algorithm requires the user to include ‘woman professor’ or ‘Latina professor,’ which reinforces the belief that a ‘real’ professor is white and male”. Why should people of color have to endure additional labor to find accurate representations of who they are?

Jenny Korn, a race and media scholar at the University of Illinois at Chicago, notes that the “discussion of algorithms should be tied to the programmers programming those algorithms. Algorithms reflect human creations of normative values around race, gender and other areas related to social justice”. Therefore, as companies continue to implement stable, systematic algorithms like the one discussed in class, they should take into consideration the diverse audiences they serve. One way of doing this is to provide employees with diversity training to help them recognize their inherent biases.

In addition, it is crucial that companies place high importance on the demographics of their coders now. As machine learning continues to grow in popularity, companies need to make sure that machines learn from code that has been written by people of color and not just White males. Ignoring this issue might eventually lead to search results that cater only to a specific audience.
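A toy example can show how a "learned" system picks up this kind of skew even when no one codes the bias in explicitly. Here a hypothetical search engine ranks image tags for a query purely by how often each tag appeared in past data; the log below is invented and deliberately skewed, echoing the professor example from the article.

```python
from collections import Counter

# Hypothetical, deliberately skewed log of past results for the
# query "professor" (counts are made up for illustration).
training_log = (
    ["white male professor"] * 80
    + ["woman professor"] * 12
    + ["Latina professor"] * 8
)

def learn_ranking(log):
    """Rank tags by frequency in the historical data, most common first."""
    counts = Counter(log)
    return [tag for tag, _ in counts.most_common()]

ranking = learn_ranking(training_log)
# The majority group dominates the top results even though the code never
# mentions race or gender: the bias lives entirely in the data it learned from.
```

This is why who produces the code and the data matters: a perfectly "neutral" frequency count faithfully reproduces whatever imbalance it was fed.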

Ultimately, the search algorithm we learned about in class is quite amazing. And although Networks 2040 might not be the most appropriate class in which to discuss the implicit biases of coders, it is still important for students in Networks 2040 to know that even the technological world has a huge social impact.

