The tech industry doesn’t have a plan for dealing with bias in facial recognition
Publication/Creation Date: July 26, 2018
Description: Facial recognition is becoming part of the fabric of everyday life. You might already use it to log in to your phone or computer, or to authenticate payments with your bank. In China, where the technology is more common, your face can be used to buy fast food or to claim your allowance of toilet paper at a public restroom. And this is to say nothing of how law enforcement agencies around the world are experimenting with facial recognition as a tool of mass surveillance.
But the widespread uptake of this technology masks underlying structural problems, not least the issue of bias. By this, researchers mean that software used for facial identification, recognition, or analysis performs differently depending on the age, gender, and ethnicity of the person it's identifying.
Subjects: Critical Thinking, Social Issues, Public Space, Peoples Of Color
Date archived: October 15, 2018
Last edited: December 12, 2020