ID: Talk - "Talking about Bias in Face Recognition - How about Some Facts?"

  • Room: ID: Talks Theater (Exhibit Hall)
Tuesday, September 24, 2019: 1:10 PM - 1:25 PM

Speaker
Yevgeniy Sirotin
Scientist Manager, Identity and Data Sciences Lab
Maryland Test Facility, SAIC

Description

ID:Talks are bonus presentations in the exhibit hall and are not components of the formal agenda produced by the FedID Planning Committee.

There is increasing public attention on the potential negative impacts of face recognition technology. Current concerns include how well the technology works for different demographic groups. As part of our work testing biometric technologies for the Department of Homeland Security Science and Technology Directorate (DHS S&T), our research group has investigated this topic. We do find some gaps in the performance of face recognition technology; however, many current media narratives, including those referencing our work on demographics, carry misleading undertones. Most organizations recognize that new technologies bring both benefits and risks. It is therefore important to outline what these are so that they can be mitigated. In this presentation, I will discuss some undertones emerging from media narratives about bias in face recognition, identify their origins, and provide relevant facts, from our group's work and from the work of other groups, that contextualize these issues and help inform how to discuss them with the public. Specifically, I will provide facts addressing the following statements:

  1. Face recognition makes errors.
  2. Face recognition is only 66% accurate for women of color.
  3. Face recognition algorithms are the only source of bias in operational systems.
  4. Face recognition always fails for a specific demographic group.
  5. Face recognition is just like other biometric modalities.