Diversity & Inclusion in Surveillance AI

Description

Facial recognition and diversity have not gone together seamlessly. On the one hand, despite the popular rise of facial recognition technology over the past several years, people of color have been excluded from its design and implementation. Facial recognition tools are often discriminatory: time after time, the software either fails to recognize people of color or mislabels them. On the other hand, facial recognition technology is increasingly integrated into police and state surveillance tools. Addressing these tensions, this class will explore tools of surveillance and whom they target, the social and political implications of AI development, and the unintended effects that diversity and inclusion efforts might have.

SFPC Code Societies 2018
Instructor: Sarah Aoun

Readings

Suggestion: at a minimum, read the three readings under Algorithmic Accountability & Machine Bias (and watch the short video), and read a few of the pieces under Surveillance Technology. The rest is (highly) recommended.

Algorithmic Accountability & Machine Bias

Facial Recognition Is Accurate, if You’re a White Guy

Video: How I'm fighting bias in algorithms

Against Black Inclusion in Facial Recognition

Machine Bias

History of Surveillance

Video: Race, Surveillance and Empire: A Historical Overview

Surveillance Technology

Facial recognition may be coming to a police body camera near you

The Next Frontier of Police Surveillance Is Drones

Palantir has secretly been using New Orleans to test its predictive policing technology

Facial recognition software is not ready for use by law enforcement

Taser will use police body camera videos to "anticipate criminal activity"

Racial profiling, by a computer? Police facial-ID tech raises civil rights concerns
