Face the bias in facial recognition software. 

Get daily actions in your inbox. Subscribe Now ›

Happy Friday,

I'm fascinated with the intersection of equity and technology, so I've been following today's topic for a few years. It illuminates how racism thrives not just in the systems we build, but in the technology we build, too. And even if these technologies were completely unbiased, we could still wield them as tools or weapons to pursue misguided agendas.

Today's email looks at another example of harmful policing and its political implications, but I hope it also encourages you to take a second look at your smartphone, your laptop, and your favorite apps, and consider: who was this built for? How does it help or hurt this movement?

Tomorrow will be a Q&A email, so send in your requests. And your support makes these emails possible! You can contribute 
one-time on PayPal or Venmo (@nicoleacardoza) or give monthly on Patreon to keep these going.

- Nicole


TAKE ACTION


  1. Sign the petition calling to ban law enforcement from using facial recognition. (The bill referenced in the petition is this one, which I discuss at the end of this email).

  2. Learn how facial recognition is being used in your local community. Here is a map with countless examples across the U.S.

GET EDUCATED


Let's face the facts: Facial recognition software is biased.
 

Facial recognition software has been struggling to save face for a while. So it wasn't a good look when, this week – in the midst of the protests, no less – the ACLU accused the Detroit Police Department of what they're calling the first known wrongful arrest involving facial-recognition technology. 

Robert Williams was arrested in his driveway and detained for 30 hours under suspicion of theft. Images of the suspect, stealing from a watch store in downtown Detroit, were run through facial recognition software, and Robert Williams was a match. It wasn't until officers interrogated him that they realized his face didn't match the pictures – at all. Read more on CNN Business >

It might not be surprising to learn that Robert Williams is Black. If you've been following the facial recognition conversation over the past few years, you might have guessed from the beginning. That's because dozens of studies have shown that facial recognition software can be disproportionately inaccurate when it tries to identify Black people and other people of color.

Joy Buolamwini, a researcher at the M.I.T. Media Lab and founder of the Algorithmic Justice League, published one of the first comprehensive studies on facial recognition bias in 2018 after her firsthand experience (more via the NYTimes). The study found that software was much more likely to misidentify the gender of Black women than that of white men. Her work, including her popular TED Talk, paved the way for a larger discussion of the flaws of facial recognition.

“Facial recognition is increasingly penetrating our lives, but there is still time to prevent it from worsening social inequalities. To do that, we must face the coded gaze.”

– Joy Buolamwini in her op-ed for the NYTimes


More reports were quick to follow, including one from the National Institute of Standards and Technology that found that African American people were 10 to 100 times more likely to be misidentified than Caucasians, and that the Native American population had the highest error rates (full study on the NIST website). It also found that women were more likely to be misidentified than men, and that the elderly and children were more likely to be misidentified than those in other age groups. Middle-aged white men generally benefited from the highest accuracy rates (via Washington Post). Another study, by CU Boulder, found that facial analysis services are also "universally unable to classify non-binary genders" (EurekAlert).

A main reason for these discrepancies is that facial recognition software can only be as smart as the data that feeds it. And most facial recognition training data sets are estimated to be more than 75% male and more than 80% white (Brookings). Unsurprisingly, the lack of diversity in tech also means that there are few women or people of color on the teams building this software, and increasing representation is likely to create a more equitable product (USA Today).

Have you ever tried to unlock your iPhone while wearing a face mask, only to have it fail? That type of facial recognition error is simply a slight annoyance. But consider its application in policing, especially given the systemic racism that persists even without the use of technology. And then consider that other algorithms used in criminal justice are also biased, like this algorithm that tries to assess the likelihood of future crimes (via ProPublica). I don't think we need another way to discriminate against those systemically marginalized. More on the dangers of policing in this article in The Week >

And its applications extend beyond dangerous policing to nearly everything we do. It's being used to monitor fans at concerts (the Guardian), verify our identities at the airport (Medium), and even as security in schools (Edweek). It's also not just a tool, but a weapon: the stories of the Chinese government using advanced facial recognition technology to track and control the Uighurs, a Muslim minority, are bone-chilling (NYTimes).

Even if you haven't seen the news around facial recognition software, it's likely seen you: over half of all Americans are in a law enforcement face recognition network (via Georgetown Law). So the next time the police run a grainy photo of a robbery suspect through that network, they could arrest you in the suspect's place.

The Facial Recognition and Biometric Technology Moratorium Act of 2020, a federal bill announced yesterday, is a step toward federal regulation that would protect everyone, particularly those systemically marginalized, even going so far as to divest funds from law enforcement agencies that use the technology inappropriately. Read more on The Verge >


“Facial-recognition technology doesn’t just pose a grave threat to our privacy, it physically endangers Black Americans and other minority populations in our country. As we work to dismantle the systematic racism that permeates every part of our society, we can’t ignore the harms that these technologies present.”

– Massachusetts Sen. Edward Markey via Fortune Magazine


Thank you for all your financial contributions! If you haven't already, consider making a monthly donation to this work. These funds will help me operationalize this work for the greatest impact.

Subscribe on Patreon | Give one-time on PayPal | Venmo @nicoleacardoza
