The Government Is Using the Most Vulnerable People to Test Facial Recognition Software
Publication Title
Slate
Publication/Creation Date
March 17, 2019
Persuasive Intent
Information
Description
If you thought IBM using “quietly scraped” Flickr images to train facial recognition systems was bad, it gets worse. Our research, which will be reviewed for publication this summer, indicates that the U.S. government, researchers, and corporations have used images of immigrants, abused children, and dead people to test their facial recognition systems, all without consent. The very group the U.S. government has tasked with developing best practices and standards for the artificial intelligence industry, which includes facial recognition software and tools, is perhaps the worst offender when it comes to using images sourced without the knowledge of the people in the photographs.
HCI Platform
Other
Location on Body
Not On The Body
Marketing Keywords
Source
https://slate.com/technology/2019/03/facial-recognition-nist-verification-testing-data-sets-children-immigrants-consent.html

Date archived
May 15, 2019
Last edited
May 15, 2019
How to cite this entry
Os Keyes, Nikki Stevens, Jacqueline Wernimont, Arizona State University. (March 17, 2019). "The Government Is Using the Most Vulnerable People to Test Facial Recognition Software". Slate. The Slate Group LLC. Fabric of Digital Life. https://fabricofdigitallife.com/index.php/Detail/objects/3879