The Government Is Using the Most Vulnerable People to Test Facial Recognition Software
Publication/Creation Date: March 17, 2019
Description: If you thought IBM using “quietly scraped” Flickr images to train facial recognition systems was bad, it gets worse. Our research, which will be reviewed for publication this summer, indicates that the U.S. government, researchers, and corporations have used images of immigrants, abused children, and dead people to test their facial recognition systems, all without consent. The very group the U.S. government has tasked with developing best practices and standards for the artificial intelligence industry, which includes facial recognition software and tools, is perhaps the worst offender when it comes to using images sourced without the knowledge of the people in the photographs.
Human Rights
Date archived: May 15, 2019
Last edited: May 15, 2019
How to cite this entry
Os Keyes, Nikki Stevens, Jacqueline Wernimont, Arizona State University. (March 17, 2019). "The Government Is Using the Most Vulnerable People to Test Facial Recognition Software". Slate. The Slate Group LLC. Fabric of Digital Life. https://fabricofdigitallife.com/index.php/Detail/objects/3879