
False Positives in Facial Recognition


An article from May 2018 reported that the South Wales Police ran facial recognition software on the crowd at a sports game, since sports events are known places for persons of interest to congregate. The program flagged 2,470 potential matches, of which only 173 turned out to be true positives. This means the program had a false positive rate of roughly 92%.

What does this mean?

By Bayes' rule, this means that if a person were to be identified by the program, there was about a 92% chance that the identification was incorrect. Note that this figure says nothing about false negatives: a person the program did not flag could still have been a person of interest that the system simply missed.
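The arithmetic behind the quoted figure can be checked directly from the two counts in the article. A small sketch (the variable names are mine; strictly speaking, the quantity computed is the share of alerts that were wrong, i.e. the false discovery rate, which is what the Bayes-style reading above describes; the exact value rounds to about 93%, close to the roughly 92% figure quoted):

```python
# Sanity check of the South Wales Police figures quoted above.
total_alerts = 2470    # faces the system flagged as potential matches
true_positives = 173   # flagged faces confirmed as persons of interest

false_positives = total_alerts - true_positives        # alerts that were wrong
false_discovery_rate = false_positives / total_alerts  # P(wrong | flagged)

print(f"False alerts: {false_positives}")                       # 2297
print(f"Share of alerts that were wrong: {false_discovery_rate:.1%}")
```

Nothing here tells us the false negative rate: for that we would also need to know how many persons of interest were present in the crowd but never flagged, a number the article does not report.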

Despite these statistics, the South Wales Police announced that the numbers showed facial recognition technology moving in the right direction. With such a high false positive rate, however, most people would likely not be comfortable with allowing this technology to become mainstream. The technology has no doubt improved in the months since May, but there is also no doubt that current facial recognition systems contain many biases, especially racial bias.

It may seem like the implementation of facial recognition technology lies in the far future, but when you look at the surveillance technology used in China to track the actions of every citizen and form a citizen "goodness" rating, widespread facial recognition no longer seems so far-fetched. With these massive inaccuracies in mind, it is important that our society starts looking into how companies are shaping facial recognition before it is too late.


